12180 1727204051.64538: starting run ansible-playbook [core 2.17.4] config file = None configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules'] ansible python module location = /usr/local/lib/python3.12/site-packages/ansible ansible collection location = /tmp/collections-G1p executable location = /usr/local/bin/ansible-playbook python version = 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] (/usr/bin/python3.12) jinja version = 3.1.4 libyaml = True No config file found; using defaults 12180 1727204051.65500: Added group all to inventory 12180 1727204051.65502: Added group ungrouped to inventory 12180 1727204051.65506: Group all now contains ungrouped 12180 1727204051.65510: Examining possible inventory source: /tmp/network-M6W/inventory-5vW.yml 12180 1727204051.90356: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache 12180 1727204051.90499: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py 12180 1727204051.90585: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory 12180 1727204051.90759: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py 12180 1727204051.90850: Loaded config def from plugin (inventory/script) 12180 1727204051.90852: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py 12180 1727204051.90897: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py 12180 1727204051.91003: Loaded config def from plugin (inventory/yaml) 12180 1727204051.91005: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py 12180 1727204051.91296: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py 12180 
1727204051.93459: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py 12180 1727204051.93466: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py) 12180 1727204051.93470: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py) 12180 1727204051.93477: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py) 12180 1727204051.93487: Loading data from /tmp/network-M6W/inventory-5vW.yml 12180 1727204051.93567: /tmp/network-M6W/inventory-5vW.yml was not parsable by auto 12180 1727204051.93948: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py) 12180 1727204051.94007: Loading data from /tmp/network-M6W/inventory-5vW.yml 12180 1727204051.94439: group all already in inventory 12180 1727204051.94446: set inventory_file for managed-node1 12180 1727204051.94450: set inventory_dir for managed-node1 12180 1727204051.94451: Added host managed-node1 to inventory 12180 1727204051.94454: Added host managed-node1 to group all 12180 1727204051.94455: set ansible_host for managed-node1 12180 1727204051.94456: set ansible_ssh_extra_args for managed-node1 12180 1727204051.94459: set inventory_file for managed-node2 12180 1727204051.94461: set inventory_dir for managed-node2 12180 1727204051.94462: Added host managed-node2 to inventory 12180 1727204051.94466: Added host managed-node2 to group all 12180 1727204051.94467: set ansible_host for managed-node2 12180 1727204051.94467: set ansible_ssh_extra_args for managed-node2 12180 1727204051.94470: set inventory_file for managed-node3 12180 1727204051.94473: set inventory_dir for managed-node3 12180 1727204051.94474: Added host managed-node3 to inventory 12180 1727204051.94475: Added host managed-node3 to group all 12180 1727204051.94476: set ansible_host for 
managed-node3 12180 1727204051.94476: set ansible_ssh_extra_args for managed-node3 12180 1727204051.94480: Reconcile groups and hosts in inventory. 12180 1727204051.94484: Group ungrouped now contains managed-node1 12180 1727204051.94485: Group ungrouped now contains managed-node2 12180 1727204051.94487: Group ungrouped now contains managed-node3 12180 1727204051.94582: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name 12180 1727204051.94781: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments 12180 1727204051.94834: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py 12180 1727204051.94872: Loaded config def from plugin (vars/host_group_vars) 12180 1727204051.94875: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True) 12180 1727204051.94883: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars 12180 1727204051.94891: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 12180 1727204051.94939: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False) 12180 1727204051.95392: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204051.95497: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py 12180 1727204051.95540: Loaded config def from plugin (connection/local) 12180 1727204051.95544: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, 
class_only=True) 12180 1727204051.96379: Loaded config def from plugin (connection/paramiko_ssh) 12180 1727204051.96383: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True) 12180 1727204051.98691: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 12180 1727204051.98734: Loaded config def from plugin (connection/psrp) 12180 1727204051.98738: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True) 12180 1727204052.00473: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 12180 1727204052.00518: Loaded config def from plugin (connection/ssh) 12180 1727204052.00522: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True) 12180 1727204052.01255: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 12180 1727204052.01315: Loaded config def from plugin (connection/winrm) 12180 1727204052.01319: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True) 12180 1727204052.01355: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name 12180 1727204052.01437: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py 12180 1727204052.01507: Loaded config def from plugin 
(shell/cmd) 12180 1727204052.01509: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True) 12180 1727204052.01544: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False) 12180 1727204052.01613: Loaded config def from plugin (shell/powershell) 12180 1727204052.01615: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True) 12180 1727204052.02030: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py 12180 1727204052.02231: Loaded config def from plugin (shell/sh) 12180 1727204052.02233: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True) 12180 1727204052.02272: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name 12180 1727204052.02489: Loaded config def from plugin (become/runas) 12180 1727204052.02491: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True) 12180 1727204052.03149: Loaded config def from plugin (become/su) 12180 1727204052.03152: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True) 12180 1727204052.03441: Loaded config def from plugin (become/sudo) 12180 1727204052.03444: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True) running playbook inside collection fedora.linux_system_roles 12180 1727204052.03634: Loading data from 
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_deprecated_nm.yml 12180 1727204052.05339: in VariableManager get_vars() 12180 1727204052.05407: done with get_vars() 12180 1727204052.05688: trying /usr/local/lib/python3.12/site-packages/ansible/modules 12180 1727204052.15858: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action 12180 1727204052.16230: in VariableManager get_vars() 12180 1727204052.16236: done with get_vars() 12180 1727204052.16239: variable 'playbook_dir' from source: magic vars 12180 1727204052.16240: variable 'ansible_playbook_python' from source: magic vars 12180 1727204052.16241: variable 'ansible_config_file' from source: magic vars 12180 1727204052.16241: variable 'groups' from source: magic vars 12180 1727204052.16243: variable 'omit' from source: magic vars 12180 1727204052.16243: variable 'ansible_version' from source: magic vars 12180 1727204052.16244: variable 'ansible_check_mode' from source: magic vars 12180 1727204052.16245: variable 'ansible_diff_mode' from source: magic vars 12180 1727204052.16246: variable 'ansible_forks' from source: magic vars 12180 1727204052.16246: variable 'ansible_inventory_sources' from source: magic vars 12180 1727204052.16247: variable 'ansible_skip_tags' from source: magic vars 12180 1727204052.16248: variable 'ansible_limit' from source: magic vars 12180 1727204052.16249: variable 'ansible_run_tags' from source: magic vars 12180 1727204052.16249: variable 'ansible_verbosity' from source: magic vars 12180 1727204052.16290: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml 12180 1727204052.18781: in VariableManager get_vars() 12180 1727204052.18799: done with get_vars() 12180 1727204052.18836: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml statically imported: 
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml 12180 1727204052.21713: in VariableManager get_vars() 12180 1727204052.21755: done with get_vars() 12180 1727204052.21770: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml statically imported: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 12180 1727204052.22315: in VariableManager get_vars() 12180 1727204052.22419: done with get_vars() 12180 1727204052.22861: in VariableManager get_vars() 12180 1727204052.22980: done with get_vars() 12180 1727204052.23035: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml statically imported: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 12180 1727204052.24311: in VariableManager get_vars() 12180 1727204052.24332: done with get_vars() 12180 1727204052.25352: in VariableManager get_vars() 12180 1727204052.25370: done with get_vars() 12180 1727204052.25375: variable 'omit' from source: magic vars 12180 1727204052.25408: variable 'omit' from source: magic vars 12180 1727204052.25522: in VariableManager get_vars() 12180 1727204052.25544: done with get_vars() 12180 1727204052.25594: in VariableManager get_vars() 12180 1727204052.25761: done with get_vars() 12180 1727204052.25813: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 12180 1727204052.26387: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 12180 1727204052.26810: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 12180 
1727204052.28662: in VariableManager get_vars() 12180 1727204052.28690: done with get_vars() 12180 1727204052.29929: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__ 12180 1727204052.30281: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__ redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 12180 1727204052.35279: in VariableManager get_vars() 12180 1727204052.35485: done with get_vars() 12180 1727204052.35496: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml statically imported: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 12180 1727204052.35704: in VariableManager get_vars() 12180 1727204052.35725: done with get_vars() 12180 1727204052.35962: in VariableManager get_vars() 12180 1727204052.35982: done with get_vars() 12180 1727204052.36912: in VariableManager get_vars() 12180 1727204052.36931: done with get_vars() 12180 1727204052.36937: variable 'omit' from source: magic vars 12180 1727204052.36962: variable 'omit' from source: magic vars 12180 1727204052.37121: in VariableManager get_vars() 12180 1727204052.37137: done with get_vars() 12180 1727204052.37158: in VariableManager get_vars() 12180 1727204052.37175: done with get_vars() 12180 1727204052.37292: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 12180 1727204052.37576: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 12180 1727204052.37776: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 12180 1727204052.38813: in VariableManager get_vars() 12180 1727204052.38974: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 12180 
1727204052.45334: in VariableManager get_vars() 12180 1727204052.45358: done with get_vars() 12180 1727204052.45416: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml statically imported: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml 12180 1727204052.46982: in VariableManager get_vars() 12180 1727204052.47005: done with get_vars() 12180 1727204052.47183: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback 12180 1727204052.47198: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__ redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug 12180 1727204052.47798: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py 12180 1727204052.48084: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug) 12180 1727204052.48087: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-G1p/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) 12180 1727204052.48169: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name 12180 1727204052.48196: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False) 12180 1727204052.48601: Loading ModuleDocFragment 'result_format_callback' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py 12180 1727204052.48773: Loaded config def from plugin (callback/default) 12180 1727204052.48775: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 12180 1727204052.58184: Loaded config def from plugin (callback/junit) 12180 1727204052.58188: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 12180 1727204052.58355: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False) 12180 1727204052.58419: Loaded config def from plugin (callback/minimal) 12180 1727204052.58421: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 12180 1727204052.58579: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 12180 1727204052.58640: Loaded config def from plugin (callback/tree) 12180 1727204052.58643: Loading CallbackModule 
'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks 12180 1727204052.58911: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks) 12180 1727204052.58914: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-G1p/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) Skipping callback 'default', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. 
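The "Skipping callback … as we already have a stdout callback" lines show that a non-default stdout callback (ansible.posix.debug) won the selection, with junit and profile_tasks enabled as extra callbacks. Since the run header reports "No config file found; using defaults", this was most likely configured through environment variables. A minimal sketch of one way to reproduce that selection — an assumption, since the actual mechanism is not visible in the log:

```shell
# Hypothetical reproduction of the callback setup seen in the log:
# ansible.posix.debug as the stdout callback, with junit and
# profile_tasks enabled in addition. Both variables are read by
# ansible-playbook at startup.
export ANSIBLE_STDOUT_CALLBACK=ansible.posix.debug
export ANSIBLE_CALLBACKS_ENABLED=junit,ansible.posix.profile_tasks
echo "$ANSIBLE_STDOUT_CALLBACK"
```

With a stdout callback set this way, the built-in 'default', 'minimal', and 'oneline' callbacks are skipped exactly as the log shows.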
PLAYBOOK: tests_bond_deprecated_nm.yml *****************************************
2 plays in /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_deprecated_nm.yml
12180 1727204052.58941: in VariableManager get_vars()
12180 1727204052.58956: done with get_vars()
12180 1727204052.58962: in VariableManager get_vars()
12180 1727204052.58972: done with get_vars()
12180 1727204052.58976: variable 'omit' from source: magic vars
12180 1727204052.59239: in VariableManager get_vars()
12180 1727204052.59270: done with get_vars()
12180 1727204052.59294: variable 'omit' from source: magic vars

PLAY [Run playbook 'playbooks/tests_bond_deprecated.yml' with nm as provider] ***
12180 1727204052.60635: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
12180 1727204052.60715: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
12180 1727204052.61224: getting the remaining hosts for this loop
12180 1727204052.61226: done getting the remaining hosts for this loop
12180 1727204052.61229: getting the next task for host managed-node1
12180 1727204052.61233: done getting next task for host managed-node1
12180 1727204052.61235: ^ task is: TASK: Gathering Facts
12180 1727204052.61237: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12180 1727204052.61239: getting variables
12180 1727204052.61240: in VariableManager get_vars()
12180 1727204052.61252: Calling all_inventory to load vars for managed-node1
12180 1727204052.61254: Calling groups_inventory to load vars for managed-node1
12180 1727204052.61257: Calling all_plugins_inventory to load vars for managed-node1
12180 1727204052.61273: Calling all_plugins_play to load vars for managed-node1
12180 1727204052.61288: Calling groups_plugins_inventory to load vars for managed-node1
12180 1727204052.61292: Calling groups_plugins_play to load vars for managed-node1
12180 1727204052.61325: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12180 1727204052.61382: done with get_vars()
12180 1727204052.61394: done getting variables
12180 1727204052.61463: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_deprecated_nm.yml:6
Tuesday 24 September 2024 14:54:12 -0400 (0:00:00.026) 0:00:00.026 *****
12180 1727204052.61489: entering _queue_task() for managed-node1/gather_facts
12180 1727204052.61495: Creating lock for gather_facts
12180 1727204052.62186: worker is 1 (out of 1 available)
12180 1727204052.62196: exiting _queue_task() for managed-node1/gather_facts
12180 1727204052.62210: done queuing things up, now waiting for results queue to drain
12180 1727204052.62213: waiting for pending results...
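Once the worker picks up the queued task, the first thing the SSH connection does (visible further down in the log) is a cheap probe for the remote user's home directory. The probe is plain POSIX shell and can be run locally:

```shell
# Ansible's first low-level command on a new connection: discover the
# login user's home directory. 'sleep 0' simply forces a clean, flushed
# exit after the echo so the single output line arrives intact over SSH.
/bin/sh -c 'echo ~ && sleep 0'
```

On the managed node in this log it returns /root, which is why the later temporary paths live under /root/.ansible/tmp.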
12180 1727204052.63597: running TaskExecutor() for managed-node1/TASK: Gathering Facts 12180 1727204052.63929: in run() - task 0affcd87-79f5-ccb1-55ae-0000000000cd 12180 1727204052.63954: variable 'ansible_search_path' from source: unknown 12180 1727204052.64003: calling self._execute() 12180 1727204052.64299: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204052.64312: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204052.64331: variable 'omit' from source: magic vars 12180 1727204052.64686: variable 'omit' from source: magic vars 12180 1727204052.64724: variable 'omit' from source: magic vars 12180 1727204052.64947: variable 'omit' from source: magic vars 12180 1727204052.65098: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12180 1727204052.65477: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12180 1727204052.65645: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12180 1727204052.65802: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204052.65818: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204052.65856: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12180 1727204052.65868: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204052.65881: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204052.66206: Set connection var ansible_pipelining to False 12180 1727204052.66215: Set connection var ansible_shell_type to sh 12180 1727204052.66229: Set connection var ansible_timeout to 10 12180 1727204052.66248: Set connection var ansible_connection to ssh 12180 
1727204052.66265: Set connection var ansible_shell_executable to /bin/sh 12180 1727204052.66282: Set connection var ansible_module_compression to ZIP_DEFLATED 12180 1727204052.66322: variable 'ansible_shell_executable' from source: unknown 12180 1727204052.66332: variable 'ansible_connection' from source: unknown 12180 1727204052.66339: variable 'ansible_module_compression' from source: unknown 12180 1727204052.66346: variable 'ansible_shell_type' from source: unknown 12180 1727204052.66351: variable 'ansible_shell_executable' from source: unknown 12180 1727204052.66423: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204052.66436: variable 'ansible_pipelining' from source: unknown 12180 1727204052.66443: variable 'ansible_timeout' from source: unknown 12180 1727204052.66450: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204052.66865: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12180 1727204052.66882: variable 'omit' from source: magic vars 12180 1727204052.66891: starting attempt loop 12180 1727204052.66897: running the handler 12180 1727204052.66917: variable 'ansible_facts' from source: unknown 12180 1727204052.66945: _low_level_execute_command(): starting 12180 1727204052.67074: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12180 1727204052.69947: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204052.69953: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204052.69970: stderr chunk (state=3): 
>>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204052.70207: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration <<< 12180 1727204052.70210: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204052.70213: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204052.70276: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204052.70419: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204052.70482: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204052.70561: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204052.72293: stdout chunk (state=3): >>>/root <<< 12180 1727204052.72397: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204052.72488: stderr chunk (state=3): >>><<< 12180 1727204052.72492: stdout chunk (state=3): >>><<< 12180 1727204052.72618: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204052.72622: _low_level_execute_command(): starting 12180 1727204052.72624: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204052.7251725-12276-43698125584022 `" && echo ansible-tmp-1727204052.7251725-12276-43698125584022="` echo /root/.ansible/tmp/ansible-tmp-1727204052.7251725-12276-43698125584022 `" ) && sleep 0' 12180 1727204052.74831: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204052.74837: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204052.74847: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204052.74850: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204052.74863: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204052.74879: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 
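The second low-level command above wraps its mkdir calls in a subshell with `umask 77`, so the remote temp directory comes out mode 700 and is readable only by the connecting user. A local sketch of the same idiom (the demo path is made up for illustration):

```shell
# Same pattern as the log's remote tmp-dir creation: umask 77 is scoped
# to the subshell, so directories created inside it end up private
# (mode 700) without disturbing the caller's umask.
demo="${TMPDIR:-/tmp}/ansible-tmp-demo-$$"
( umask 77 && mkdir -p "$demo" ) && echo "created"
ls -ld "$demo" | cut -c1-10   # -> drwx------
```

Scoping the umask to a subshell is why the surrounding shell session (and any later commands Ansible runs in it) keeps its original file-creation mask.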
12180 1727204052.74887: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204052.74894: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204052.74902: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204052.74911: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204052.74923: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204052.74931: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204052.74937: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204052.74949: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204052.75021: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204052.75040: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204052.75053: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204052.75143: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204052.77010: stdout chunk (state=3): >>>ansible-tmp-1727204052.7251725-12276-43698125584022=/root/.ansible/tmp/ansible-tmp-1727204052.7251725-12276-43698125584022 <<< 12180 1727204052.77195: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204052.77200: stdout chunk (state=3): >>><<< 12180 1727204052.77203: stderr chunk (state=3): >>><<< 12180 1727204052.77233: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204052.7251725-12276-43698125584022=/root/.ansible/tmp/ansible-tmp-1727204052.7251725-12276-43698125584022 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204052.77268: variable 'ansible_module_compression' from source: unknown 12180 1727204052.77330: ANSIBALLZ: Using generic lock for ansible.legacy.setup 12180 1727204052.77334: ANSIBALLZ: Acquiring lock 12180 1727204052.77336: ANSIBALLZ: Lock acquired: 140650305861680 12180 1727204052.77338: ANSIBALLZ: Creating module 12180 1727204053.66525: ANSIBALLZ: Writing module into payload 12180 1727204053.67448: ANSIBALLZ: Writing module 12180 1727204053.67478: ANSIBALLZ: Renaming module 12180 1727204053.67482: ANSIBALLZ: Done creating module 12180 1727204053.67743: variable 'ansible_facts' from source: unknown 12180 1727204053.67751: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12180 1727204053.67761: _low_level_execute_command(): starting 12180 1727204053.67771: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v 
'"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0' 12180 1727204053.69499: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204053.69617: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204053.69671: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204053.69675: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204053.69678: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204053.69872: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204053.69875: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204053.69878: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204053.69880: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204053.69882: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204053.69884: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204053.69886: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204053.69887: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204053.69889: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204053.69891: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204053.69893: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 
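Annotation: the probe command above brackets its output with `PLATFORM`/`FOUND`/`ENDFOUND` marker lines so the controller can split platform name from candidate interpreter paths regardless of how many `command -v` lookups succeed. A sketch of parsing that output (the function name `parse_probe_output` is illustrative; the real logic lives in Ansible's interpreter discovery code):

```python
def parse_probe_output(stdout: str):
    """Split the PLATFORM/FOUND/ENDFOUND probe output into the platform
    name and the list of interpreter paths found on the remote host."""
    lines = [line.strip() for line in stdout.splitlines() if line.strip()]
    platform = lines[lines.index("PLATFORM") + 1]
    # Everything between FOUND and ENDFOUND is one resolved path per line.
    found = lines[lines.index("FOUND") + 1 : lines.index("ENDFOUND")]
    return platform, found

# The stdout observed in the log below: python3.9 resolves, and both the
# /usr/bin/python3 and bare python3 lookups resolve to the same binary.
stdout = (
    "PLATFORM\nLinux\nFOUND\n"
    "/usr/bin/python3.9\n/usr/bin/python3\n/usr/bin/python3\nENDFOUND\n"
)
platform, interpreters = parse_probe_output(stdout)
```

This matches the subsequent log record `found interpreters: ['/usr/bin/python3.9', '/usr/bin/python3', '/usr/bin/python3']`: duplicates are expected, since several `command -v` probes can resolve to the same binary.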
12180 1727204053.69982: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204053.69999: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204053.70012: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204053.70105: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204053.71835: stdout chunk (state=3): >>>PLATFORM <<< 12180 1727204053.72044: stdout chunk (state=3): >>>Linux <<< 12180 1727204053.72048: stdout chunk (state=3): >>>FOUND /usr/bin/python3.9 /usr/bin/python3 /usr/bin/python3 ENDFOUND <<< 12180 1727204053.72261: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204053.72266: stdout chunk (state=3): >>><<< 12180 1727204053.72273: stderr chunk (state=3): >>><<< 12180 1727204053.72291: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.9 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204053.72302 [managed-node1]: found interpreters: ['/usr/bin/python3.9', '/usr/bin/python3', '/usr/bin/python3'] 12180 1727204053.72435: _low_level_execute_command(): starting 12180 1727204053.72438: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 && sleep 0' 12180 1727204053.73876: Sending initial data 12180 1727204053.73879: Sent initial data (1181 bytes) 12180 1727204053.75186: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204053.75191: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204053.75237: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204053.75253: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204053.75275: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204053.75294: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204053.75305: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204053.75315: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204053.75329: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204053.75343: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204053.75358: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204053.75377: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 
originally 10.31.9.148 <<< 12180 1727204053.75391: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204053.75406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204053.75488: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204053.75611: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204053.75626: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204053.75931: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204053.79751: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"9\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"9\"\nPLATFORM_ID=\"platform:el9\"\nPRETTY_NAME=\"CentOS Stream 9\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:9\"\nHOME_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 9\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<< 12180 1727204053.80476: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204053.80481: stdout chunk (state=3): >>><<< 12180 1727204053.80483: stderr chunk (state=3): >>><<< 12180 1727204053.80486: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"9\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"9\"\nPLATFORM_ID=\"platform:el9\"\nPRETTY_NAME=\"CentOS Stream 9\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:9\"\nHOME_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 9\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204053.80488: variable 'ansible_facts' from source: unknown 12180 1727204053.80490: variable 'ansible_facts' from source: unknown 12180 1727204053.80492: variable 'ansible_module_compression' from source: unknown 12180 1727204053.80494: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12180cbnqllfr/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 12180 1727204053.80496: variable 'ansible_facts' from source: unknown 12180 1727204053.80548: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204052.7251725-12276-43698125584022/AnsiballZ_setup.py 12180 1727204053.81560: Sending initial data 12180 1727204053.81566: Sent initial data (153 bytes) 12180 1727204053.85054: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204053.85127: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 
1727204053.85145: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204053.85166: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204053.85215: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204053.85285: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204053.85300: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204053.85317: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204053.85334: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204053.85346: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204053.85357: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204053.85373: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204053.85389: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204053.85401: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204053.85411: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204053.85425: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204053.85616: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204053.85640: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204053.85660: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204053.85747: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 
1727204053.87488: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12180 1727204053.87543: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12180 1727204053.87627: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12180cbnqllfr/tmpd3qxj5eo /root/.ansible/tmp/ansible-tmp-1727204052.7251725-12276-43698125584022/AnsiballZ_setup.py <<< 12180 1727204053.87678: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12180 1727204053.90886: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204053.91094: stderr chunk (state=3): >>><<< 12180 1727204053.91098: stdout chunk (state=3): >>><<< 12180 1727204053.91101: done transferring module to remote 12180 1727204053.91103: _low_level_execute_command(): starting 12180 1727204053.91105: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204052.7251725-12276-43698125584022/ /root/.ansible/tmp/ansible-tmp-1727204052.7251725-12276-43698125584022/AnsiballZ_setup.py && sleep 0' 12180 1727204053.93198: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204053.93375: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204053.93387: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 12180 1727204053.93399: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204053.93493: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204053.93500: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204053.93510: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204053.93523: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204053.93533: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204053.93541: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204053.93549: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204053.93559: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204053.94321: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204053.94337: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204053.94343: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204053.94354: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204053.94701: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204053.94710: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204053.94712: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204053.94935: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204053.96581: stderr chunk (state=3): >>>debug2: Received exit status 
from master 0 <<< 12180 1727204053.96638: stderr chunk (state=3): >>><<< 12180 1727204053.96641: stdout chunk (state=3): >>><<< 12180 1727204053.96660: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204053.96665: _low_level_execute_command(): starting 12180 1727204053.96670: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204052.7251725-12276-43698125584022/AnsiballZ_setup.py && sleep 0' 12180 1727204053.98179: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204053.98184: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204053.98350: stderr chunk (state=3): 
>>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204053.98354: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204053.98371: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204053.98377: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204053.98577: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204053.98580: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204053.98594: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204053.98687: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204054.00657: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin <<< 12180 1727204054.00661: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 12180 1727204054.00723: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 12180 1727204054.00762: stdout chunk (state=3): >>>import 'posix' # <<< 12180 1727204054.00786: stdout chunk (state=3): >>>import '_frozen_importlib_external' # <<< 12180 1727204054.00793: stdout chunk (state=3): >>># installing zipimport hook <<< 12180 1727204054.00836: stdout chunk (state=3): >>>import 'time' # <<< 12180 1727204054.00842: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 
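Annotation: the `osrelease_content` string returned by the discovery probe earlier in this run is just the text of the remote `/etc/os-release`, i.e. shell-style `KEY="value"` lines. A sketch of turning it into a dict, e.g. to read `ID`/`VERSION_ID` for distribution matching (helper name `parse_os_release` is illustrative, not Ansible's API, and this simple parser ignores quoting edge cases):

```python
def parse_os_release(text: str) -> dict:
    """Parse os-release style KEY="value" lines into a dict."""
    info = {}
    for line in text.splitlines():
        if "=" in line:
            key, _, value = line.partition("=")
            # Values in /etc/os-release are usually double-quoted.
            info[key] = value.strip('"')
    return info

# A subset of the osrelease_content payload seen in the log above.
content = 'NAME="CentOS Stream"\nVERSION="9"\nID="centos"\nVERSION_ID="9"\n'
info = parse_os_release(content)
```

With `ID=centos` and `VERSION_ID=9`, the controller selects `/usr/bin/python3.9` from the discovered interpreters, which is why the module execution above runs under `python3.9`.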
12180 1727204054.00890: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py <<< 12180 1727204054.00901: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' <<< 12180 1727204054.00911: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py <<< 12180 1727204054.00934: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' <<< 12180 1727204054.00939: stdout chunk (state=3): >>>import '_codecs' # <<< 12180 1727204054.00968: stdout chunk (state=3): >>>import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3852098dc0> <<< 12180 1727204054.01012: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py <<< 12180 1727204054.01017: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f385203d3a0> <<< 12180 1727204054.01022: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3852098b20> <<< 12180 1727204054.01051: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py <<< 12180 1727204054.01060: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' <<< 12180 1727204054.01065: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3852098ac0> <<< 12180 1727204054.01104: stdout chunk (state=3): >>>import '_signal' # <<< 12180 1727204054.01107: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py <<< 12180 1727204054.01112: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' <<< 12180 1727204054.01129: stdout chunk (state=3): >>>import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f385203d490> <<< 12180 1727204054.01168: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' <<< 12180 1727204054.01175: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' <<< 12180 1727204054.01193: stdout chunk (state=3): >>>import '_abc' # <<< 12180 1727204054.01200: stdout chunk (state=3): >>>import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f385203d940> <<< 12180 1727204054.01223: stdout chunk (state=3): >>>import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f385203d670> <<< 12180 1727204054.01254: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py <<< 12180 1727204054.01258: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' <<< 12180 1727204054.01279: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py <<< 12180 1727204054.01306: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' <<< 12180 1727204054.01313: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py <<< 12180 1727204054.01352: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' <<< 12180 1727204054.01359: stdout chunk (state=3): >>>import '_stat' # <<< 12180 1727204054.01370: stdout chunk (state=3): >>>import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851dcf190> <<< 12180 1727204054.01376: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py <<< 12180 1727204054.01401: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' <<< 12180 1727204054.01473: stdout chunk (state=3): >>>import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851dcf220> <<< 12180 1727204054.01496: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py <<< 12180 1727204054.01501: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' <<< 12180 1727204054.01530: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py <<< 12180 1727204054.01534: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851df2850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851dcf940> <<< 12180 1727204054.01562: stdout chunk (state=3): >>>import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3852055880> <<< 12180 1727204054.01589: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' <<< 12180 1727204054.01596: stdout chunk (state=3): >>>import '_sitebuiltins' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f3851dc8d90> <<< 12180 1727204054.01654: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py <<< 12180 1727204054.01657: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # <<< 12180 1727204054.01663: stdout chunk (state=3): >>>import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851df2d90> <<< 12180 1727204054.01721: stdout chunk (state=3): >>>import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f385203d970> <<< 12180 1727204054.01751: stdout chunk (state=3): >>>Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 12180 1727204054.02086: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py <<< 12180 1727204054.02100: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' <<< 12180 1727204054.02112: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py <<< 12180 1727204054.02128: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' <<< 12180 1727204054.02138: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py <<< 12180 1727204054.02168: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' <<< 12180 1727204054.02180: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py <<< 12180 1727204054.02190: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' <<< 12180 1727204054.02195: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851d6ef10> <<< 12180 1727204054.02253: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851d740a0> <<< 12180 1727204054.02282: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py <<< 12180 1727204054.02285: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' <<< 12180 1727204054.02288: stdout chunk (state=3): >>>import '_sre' # <<< 12180 1727204054.02318: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py <<< 12180 1727204054.02324: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' <<< 12180 1727204054.02354: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' <<< 12180 1727204054.02450: stdout chunk (state=3): >>>import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851d675b0> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851d6f6a0> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851d6e3d0> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py <<< 12180 1727204054.02562: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py # code object from 
'/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' <<< 12180 1727204054.02669: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3851c55e50> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851c55940> import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851c55f40> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' <<< 12180 1727204054.02693: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851c55d90> <<< 12180 1727204054.02726: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' <<< 12180 1727204054.02734: stdout chunk (state=3): >>>import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851c66100> <<< 12180 1727204054.02741: stdout chunk (state=3): >>>import '_collections' # <<< 12180 1727204054.02788: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851d49dc0> import '_functools' # <<< 12180 1727204054.02826: stdout chunk 
(state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851d426a0> <<< 12180 1727204054.02870: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py <<< 12180 1727204054.02876: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851d55700> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851d75eb0> <<< 12180 1727204054.02991: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3851c66d00> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851d492e0> # extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3851d55310> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851d7ba60> <<< 12180 1727204054.03048: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from 
'/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' <<< 12180 1727204054.03069: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' <<< 12180 1727204054.03084: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851c66ee0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851c66e20> <<< 12180 1727204054.03180: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851c66d90> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' <<< 12180 1727204054.03192: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py <<< 12180 1727204054.03250: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' <<< 12180 1727204054.03271: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' <<< 12180 1727204054.03284: stdout chunk (state=3): >>>import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851c39400> # 
/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py <<< 12180 1727204054.03322: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' <<< 12180 1727204054.03337: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851c394f0> <<< 12180 1727204054.03452: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851c6ef70> <<< 12180 1727204054.03542: stdout chunk (state=3): >>>import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851c68ac0> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851c68490> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' <<< 12180 1727204054.03558: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py <<< 12180 1727204054.03650: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851b6d250> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851c24550> <<< 12180 1727204054.03765: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851c68f40> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851d7b0d0> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from 
'/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py <<< 12180 1727204054.03783: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851b7fb80> <<< 12180 1727204054.03797: stdout chunk (state=3): >>>import 'errno' # <<< 12180 1727204054.03873: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3851b7feb0> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' <<< 12180 1727204054.03885: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851b907c0> <<< 12180 1727204054.03982: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py # code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851b90d00> <<< 12180 1727204054.04001: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' <<< 12180 1727204054.04007: stdout chunk (state=3): >>># extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f3851b2a430> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851b7ffa0> <<< 12180 1727204054.04090: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3851b3a310> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851b90640> <<< 12180 1727204054.04101: stdout chunk (state=3): >>>import 'pwd' # <<< 12180 1727204054.04203: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3851b3a3d0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851c66a60> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py <<< 12180 1727204054.04215: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py <<< 12180 1727204054.04313: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' # extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f3851b56730> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' <<< 12180 1727204054.04329: stdout chunk (state=3): >>># extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3851b56a00> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851b567f0> <<< 12180 1727204054.04419: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3851b568e0> # /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' <<< 12180 1727204054.04574: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' <<< 12180 1727204054.04593: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3851b56d30> <<< 12180 1727204054.04609: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' 
import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3851b60280> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851b56970> <<< 12180 1727204054.04638: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851b49ac0> <<< 12180 1727204054.04650: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851c66640> <<< 12180 1727204054.04743: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' <<< 12180 1727204054.04773: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851b56b20> <<< 12180 1727204054.04917: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/cp437.pyc' <<< 12180 1727204054.04934: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f3851a8f700> <<< 12180 1727204054.05166: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip' <<< 12180 1727204054.05183: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.05267: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.05298: stdout chunk (state=3): >>>import ansible # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/__init__.py # zipimport: zlib available <<< 12180 1727204054.05316: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.05333: stdout chunk (state=3): >>>import ansible.module_utils # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/__init__.py <<< 12180 1727204054.05348: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.06550: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.07478: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38519ce850> <<< 12180 1727204054.07499: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' <<< 12180 1727204054.07513: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' <<< 12180 1727204054.07542: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' <<< 12180 1727204054.07568: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' <<< 12180 1727204054.07572: stdout chunk (state=3): >>># extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38519ce160> <<< 12180 1727204054.07601: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38519ce280> <<< 12180 1727204054.07653: stdout chunk (state=3): >>>import 
'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38519cefa0> <<< 12180 1727204054.07657: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py <<< 12180 1727204054.07659: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' <<< 12180 1727204054.07706: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38519ce4f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38519cedc0> <<< 12180 1727204054.07709: stdout chunk (state=3): >>>import 'atexit' # <<< 12180 1727204054.07743: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38519ce580> <<< 12180 1727204054.07756: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py <<< 12180 1727204054.07778: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' <<< 12180 1727204054.07822: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38519ce100> <<< 12180 1727204054.07843: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py <<< 12180 1727204054.07867: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' <<< 12180 1727204054.07873: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py <<< 12180 1727204054.07912: stdout chunk (state=3): >>># 
code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' <<< 12180 1727204054.07915: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py <<< 12180 1727204054.07917: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' <<< 12180 1727204054.07993: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38519a30a0> <<< 12180 1727204054.08035: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3851388370> <<< 12180 1727204054.08063: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3851388070> <<< 12180 1727204054.08090: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py <<< 12180 1727204054.08094: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' <<< 12180 1727204054.08129: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851388cd0> <<< 12180 1727204054.08142: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38519b6dc0> <<< 12180 1727204054.08321: stdout chunk (state=3): >>>import 'platform' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f38519b63a0> <<< 12180 1727204054.08328: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py <<< 12180 1727204054.08331: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' <<< 12180 1727204054.08347: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38519b6f40> <<< 12180 1727204054.08370: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py <<< 12180 1727204054.08380: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' <<< 12180 1727204054.08418: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' <<< 12180 1727204054.08421: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py <<< 12180 1727204054.08470: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' <<< 12180 1727204054.08473: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py <<< 12180 1727204054.08477: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' <<< 12180 1727204054.08479: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851a03f40> <<< 12180 1727204054.08539: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38519d5d60> <<< 12180 1727204054.08550: stdout chunk (state=3): >>>import 'linecache' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7f38519d5430> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851981af0> <<< 12180 1727204054.08578: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38519d5550> <<< 12180 1727204054.08605: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38519d5580> <<< 12180 1727204054.08631: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py <<< 12180 1727204054.08646: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' <<< 12180 1727204054.08663: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py <<< 12180 1727204054.08687: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' <<< 12180 1727204054.08754: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38513f6fa0> <<< 12180 1727204054.08771: stdout chunk (state=3): >>>import 'datetime' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f3851a15280> <<< 12180 1727204054.08789: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py <<< 12180 1727204054.08801: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' <<< 12180 1727204054.08841: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' <<< 12180 1727204054.08860: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38513f3820> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851a15400> <<< 12180 1727204054.08876: stdout chunk (state=3): >>># /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py <<< 12180 1727204054.08924: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' <<< 12180 1727204054.08944: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # <<< 12180 1727204054.08994: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851a15c40> <<< 12180 1727204054.09124: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38513f37c0> <<< 12180 1727204054.09209: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' <<< 12180 1727204054.09224: stdout chunk (state=3): >>># extension module 'systemd._journal' executed 
from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38519ae1c0> <<< 12180 1727204054.09248: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' <<< 12180 1727204054.09262: stdout chunk (state=3): >>># extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3851a159d0> <<< 12180 1727204054.09286: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' <<< 12180 1727204054.09303: stdout chunk (state=3): >>># extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3851a15550> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851a0e940> <<< 12180 1727204054.09320: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' <<< 12180 1727204054.09342: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py <<< 12180 1727204054.09358: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' <<< 12180 1727204054.09396: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension 
module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38513e8910> <<< 12180 1727204054.09574: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3851926dc0> <<< 12180 1727204054.09595: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38513f2550> <<< 12180 1727204054.09622: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38513e8eb0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38513f2970> <<< 12180 1727204054.09647: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 12180 1727204054.09659: stdout chunk (state=3): >>>import ansible.module_utils.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available <<< 12180 1727204054.09738: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.09807: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.09830: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.common # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/__init__.py <<< 12180 1727204054.09845: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/__init__.py <<< 12180 1727204054.09870: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.09959: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.10054: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.10493: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.10957: stdout chunk (state=3): >>>import ansible.module_utils.six # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # <<< 12180 1727204054.10965: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/converters.py <<< 12180 1727204054.11007: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py <<< 12180 1727204054.11011: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' <<< 12180 1727204054.11068: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f385194f7f0> <<< 12180 1727204054.11122: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py <<< 12180 1727204054.11129: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38519548b0> <<< 12180 1727204054.11148: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3850f75940> <<< 12180 1727204054.11196: stdout chunk (state=3): >>>import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/selinux.py <<< 12180 1727204054.11204: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.11209: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.11232: stdout chunk (state=3): >>>import ansible.module_utils._text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available <<< 12180 1727204054.11345: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.11522: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py <<< 12180 1727204054.11526: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' <<< 12180 1727204054.11534: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f385198c730> <<< 12180 1727204054.11537: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.11889: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.12267: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 12180 1727204054.12319: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.12388: stdout chunk (state=3): >>>import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available <<< 12180 1727204054.12417: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.12452: stdout chunk (state=3): >>>import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/warnings.py <<< 12180 1727204054.12455: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.12507: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.12586: stdout chunk (state=3): >>>import ansible.module_utils.errors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/errors.py <<< 12180 1727204054.12592: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.12620: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/__init__.py <<< 12180 1727204054.12623: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.12655: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.12698: stdout chunk (state=3): >>>import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/convert_bool.py <<< 12180 1727204054.12703: stdout chunk (state=3): >>># zipimport: zlib available 
<<< 12180 1727204054.12879: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.13076: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py <<< 12180 1727204054.13100: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' <<< 12180 1727204054.13105: stdout chunk (state=3): >>>import '_ast' # <<< 12180 1727204054.13187: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38519d12e0> <<< 12180 1727204054.13191: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.13240: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.13333: stdout chunk (state=3): >>>import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/parameters.py <<< 12180 1727204054.13341: stdout chunk (state=3): >>>import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/arg_spec.py <<< 12180 1727204054.13344: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.13376: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.13430: stdout chunk (state=3): >>>import ansible.module_utils.common.locale # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/locale.py <<< 12180 1727204054.13434: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.13456: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.13490: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.13586: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.13648: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py <<< 12180 1727204054.13663: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' <<< 12180 1727204054.13734: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3851946880> <<< 12180 1727204054.13812: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3850df1550> <<< 12180 1727204054.13860: stdout chunk (state=3): >>>import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/file.py <<< 12180 1727204054.13864: stdout chunk (state=3): >>>import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available <<< 12180 1727204054.13909: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 
1727204054.13967: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.13993: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.14025: stdout chunk (state=3): >>># /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py <<< 12180 1727204054.14053: stdout chunk (state=3): >>># code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' <<< 12180 1727204054.14059: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py <<< 12180 1727204054.14096: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' <<< 12180 1727204054.14138: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py <<< 12180 1727204054.14142: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' <<< 12180 1727204054.14219: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851957910> <<< 12180 1727204054.14260: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38519a0970> <<< 12180 1727204054.14316: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f385198a850> # destroy ansible.module_utils.distro <<< 12180 1727204054.14325: stdout chunk (state=3): >>>import ansible.module_utils.distro # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available <<< 12180 1727204054.14369: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.common._utils # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/_utils.py <<< 12180 1727204054.14373: stdout chunk (state=3): >>>import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/sys_info.py <<< 12180 1727204054.14449: stdout chunk (state=3): >>>import ansible.module_utils.basic # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/basic.py <<< 12180 1727204054.14478: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 12180 1727204054.14483: stdout chunk (state=3): >>>import ansible.modules # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/modules/__init__.py # zipimport: zlib available <<< 12180 1727204054.14536: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.14584: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.14602: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.14614: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.14657: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.14696: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.14720: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.14755: stdout chunk (state=3): >>>import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/namespace.py <<< 12180 1727204054.14779: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.14824: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.14891: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 12180 1727204054.14917: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.14938: stdout chunk (state=3): >>>import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/typing.py <<< 12180 1727204054.14950: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.15090: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.15238: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.15263: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.15326: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' <<< 12180 1727204054.15353: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' <<< 12180 1727204054.15374: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' <<< 12180 1727204054.15397: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3850cf6c70> <<< 12180 1727204054.15436: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' <<< 12180 
1727204054.15461: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py <<< 12180 1727204054.15476: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' <<< 12180 1727204054.15496: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py <<< 12180 1727204054.15520: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' <<< 12180 1727204054.15540: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3850f57a30> <<< 12180 1727204054.15570: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3850f579a0> <<< 12180 1727204054.15621: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3850f9eb20> <<< 12180 1727204054.15647: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3850f9e550> <<< 12180 1727204054.15753: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3850f8a2e0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3850f8a970> # /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py # code object 
from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' <<< 12180 1727204054.15774: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' <<< 12180 1727204054.15798: stdout chunk (state=3): >>># extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3850f3b2b0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3850f3ba00> <<< 12180 1727204054.15810: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' <<< 12180 1727204054.15854: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3850f3b940> <<< 12180 1727204054.15876: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py <<< 12180 1727204054.15896: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' <<< 12180 1727204054.15919: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3850d570d0> <<< 12180 1727204054.15936: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38519423a0> <<< 12180 1727204054.15968: stdout chunk (state=3): >>>import 
'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3850f8a670> import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/timeout.py <<< 12180 1727204054.15989: stdout chunk (state=3): >>>import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/collector.py # zipimport: zlib available <<< 12180 1727204054.16009: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/__init__.py <<< 12180 1727204054.16024: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.16079: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.16136: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/facter.py # zipimport: zlib available <<< 12180 1727204054.16178: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.16242: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/ohai.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/__init__.py <<< 12180 1727204054.16260: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.16281: stdout 
chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.16298: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/apparmor.py <<< 12180 1727204054.16322: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.16403: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.system.caps # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/caps.py # zipimport: zlib available <<< 12180 1727204054.16466: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.16487: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/chroot.py # zipimport: zlib available <<< 12180 1727204054.16571: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.16586: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.16641: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.16733: stdout chunk (state=3): >>>import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/utils.py import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/cmdline.py # zipimport: zlib available <<< 12180 1727204054.17076: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.17438: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.distribution # loaded from 
Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/distribution.py <<< 12180 1727204054.17450: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.17487: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.17539: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.17562: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.17595: stdout chunk (state=3): >>>import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/datetime.py import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/date_time.py <<< 12180 1727204054.17611: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.17629: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.17658: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/env.py <<< 12180 1727204054.17672: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.17715: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.17766: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/dns.py <<< 12180 1727204054.17779: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.17791: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.17820: stdout chunk (state=3): >>>import 
ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/fips.py <<< 12180 1727204054.17835: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.17859: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.17881: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/loadavg.py <<< 12180 1727204054.17899: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.17950: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.18030: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' <<< 12180 1727204054.18051: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3850c46eb0> <<< 12180 1727204054.18073: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py <<< 12180 1727204054.18092: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' <<< 12180 1727204054.18249: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3850c469d0> import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/local.py <<< 12180 1727204054.18265: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.18313: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.18368: stdout 
chunk (state=3): >>>import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/lsb.py <<< 12180 1727204054.18383: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.18450: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.18528: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py <<< 12180 1727204054.18541: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.18597: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.18656: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/platform.py <<< 12180 1727204054.18673: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.18703: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.18746: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py <<< 12180 1727204054.18761: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' <<< 12180 1727204054.18906: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3850cb2bb0> <<< 12180 1727204054.19149: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f3850c56a60> import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/python.py <<< 12180 1727204054.19152: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.19193: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.19249: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/selinux.py <<< 12180 1727204054.19254: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.19324: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.19401: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.19489: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.19623: stdout chunk (state=3): >>>import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/version.py <<< 12180 1727204054.19626: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py # zipimport: zlib available <<< 12180 1727204054.19667: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.19719: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py <<< 12180 1727204054.19723: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.19744: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 12180 1727204054.19788: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' <<< 12180 1727204054.19858: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' <<< 12180 1727204054.19864: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3850cb9040> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3850cb96d0> import ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/user.py <<< 12180 1727204054.19867: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.19869: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.19884: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py # zipimport: zlib available <<< 12180 1727204054.19917: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.19967: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/base.py <<< 12180 1727204054.19972: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.20088: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.20219: stdout chunk 
(state=3): >>>import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/aix.py <<< 12180 1727204054.20222: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.20302: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.20383: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.20419: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.20463: stdout chunk (state=3): >>>import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/sysctl.py <<< 12180 1727204054.20468: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py # zipimport: zlib available <<< 12180 1727204054.20548: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.20561: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.20679: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.20802: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py <<< 12180 1727204054.20807: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py # zipimport: zlib available <<< 12180 1727204054.20912: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.21020: 
stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py <<< 12180 1727204054.21023: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.21054: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.21085: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.21517: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.21936: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py <<< 12180 1727204054.21951: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.22034: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.22127: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py <<< 12180 1727204054.22131: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.22212: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.22304: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py <<< 12180 1727204054.22307: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.22431: stdout 
chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.22580: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py <<< 12180 1727204054.22587: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.22591: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.22611: stdout chunk (state=3): >>>import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/__init__.py # zipimport: zlib available <<< 12180 1727204054.22642: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.22681: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/base.py # zipimport: zlib available <<< 12180 1727204054.22771: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.22851: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.23078: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.23187: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/aix.py <<< 12180 1727204054.23204: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.23244: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 12180 1727204054.23277: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/darwin.py <<< 12180 1727204054.23292: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.23301: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.23328: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py # zipimport: zlib available <<< 12180 1727204054.23398: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.23459: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py <<< 12180 1727204054.23476: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 12180 1727204054.23505: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/freebsd.py # zipimport: zlib available <<< 12180 1727204054.23557: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.23611: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hpux.py <<< 12180 1727204054.23619: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.23661: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 
1727204054.23715: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hurd.py <<< 12180 1727204054.23723: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.23936: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.24157: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/linux.py <<< 12180 1727204054.24164: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.24212: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.24268: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/iscsi.py # zipimport: zlib available <<< 12180 1727204054.24302: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.24335: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/nvme.py <<< 12180 1727204054.24343: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.24365: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.24400: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/netbsd.py # zipimport: zlib available <<< 12180 1727204054.24434: stdout chunk (state=3): >>># zipimport: zlib available <<< 
12180 1727204054.24469: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/openbsd.py <<< 12180 1727204054.24477: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.24539: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.24614: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/sunos.py <<< 12180 1727204054.24631: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 12180 1727204054.24651: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py <<< 12180 1727204054.24657: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.24692: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.24738: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/base.py # zipimport: zlib available <<< 12180 1727204054.24767: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.24785: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.24829: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.24871: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.24926: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.24993: stdout chunk (state=3): >>>import 
ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py <<< 12180 1727204054.25012: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.25050: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.25098: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py # zipimport: zlib available <<< 12180 1727204054.25261: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.25430: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/linux.py <<< 12180 1727204054.25440: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.25476: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.25518: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py <<< 12180 1727204054.25524: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.25568: stdout chunk (state=3): >>># zipimport: zlib available <<< 
12180 1727204054.25615: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py <<< 12180 1727204054.25622: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.25685: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.25769: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/default_collectors.py <<< 12180 1727204054.25778: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.25843: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.25920: stdout chunk (state=3): >>>import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/ansible_collector.py import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/__init__.py <<< 12180 1727204054.25993: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204054.26190: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py # code object from 
'/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' <<< 12180 1727204054.26206: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py <<< 12180 1727204054.26213: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' <<< 12180 1727204054.26247: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3850c3beb0> <<< 12180 1727204054.26254: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3850c3bb80> <<< 12180 1727204054.26304: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3850be8940> <<< 12180 1727204054.27287: stdout chunk (state=3): >>>import 'gc' # <<< 12180 1727204054.29098: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/queues.py <<< 12180 1727204054.29116: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3850c3b040> <<< 12180 1727204054.29124: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/synchronize.py <<< 12180 1727204054.29141: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc' <<< 12180 1727204054.29165: stdout chunk (state=3): >>>import 'multiprocessing.synchronize' 
# <_frozen_importlib_external.SourceFileLoader object at 0x7f3850bfe730> <<< 12180 1727204054.29222: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc' <<< 12180 1727204054.29257: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc' <<< 12180 1727204054.29271: stdout chunk (state=3): >>>import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3850a4e280> <<< 12180 1727204054.29279: stdout chunk (state=3): >>>import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3850a4e070> <<< 12180 1727204054.29567: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame <<< 12180 1727204054.29570: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 12180 1727204054.53714: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_iscsi_iqn": "", "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 
SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node1", "ansible_hostname": "managed-node1", "ansible_nodename": "managed-node1", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "622812727ab94fd6acd7dd0d437b6e90", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 42862 10.31.9.148 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 42862 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "confi<<< 12180 1727204054.53786: stdout chunk (state=3): >>>g_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_cmdline": {"BOOT_IMAGE": 
"(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2807, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 725, "free": 2807}, "nocache": {"free": 3270, "used": 262}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec28c5e6-50d6-5684-e735-f75357a23b08", "ansible_product_uuid": "ec28c5e6-50d6-5684-e735-f75357a23b08", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, 
"sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 317, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264273145856, "block_size": 4096, "block_total": 65519355, "block_available": 64519811, "block_used": 999544, "inode_total": 131071472, "inode_available": 130998255, "inode_used": 73217, "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae"}], "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_fibre_channel_wwn": [], "ansible_local": {}, "ansible_lsb": {}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5aef1ea-3141-48ae-bf33-0c6b351dd422", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "a<<< 12180 1727204054.53796: stdout chunk (state=3): >>>nsible_virtualization_tech_host": [], "ansible_loadavg": {"1m": 0.43, "5m": 0.38, "15m": 0.18}, "ansible_ssh_host_key_dsa_public": 
"AAAAB3NzaC1kc3MAAACBAL33r0sK53nK1ELEWEygFfBly+jKL3G1irB+e4OjfP+034giVSb4+qmZbsccYzFRUysDiwQ9AOkXxjXzuDmR+xqyvjg1wiGR1mtnhVEjD5QOMP3FrsA4T0YUj+99RePF5V1syZcivhL83fhGMQW2xqX2DsatToaaogZ2OB3PfjtjAAAAFQDxVD/D0jmbOX2y1lkpNUepQHopQwAAAIEAmIlCnXBlcPjUWk7TsM1FtBhlH1jHHCOTF1EkipNNB3yizhCo4XzHdZ42Etc3A12/rcZ94rFaauqCV6XrShBkQ2YBIcz9u8BOyWI/nScoq9IA/qLrhWWtjBWDyrdKnEa5YZssQtDa+FaZQkzy1TQpvFQxv5c95+TrmPFgDpw+0q0AAACBAKYOTFtEPTGzq9w6YdMspRWk65ZgXou58bQl818PvNnuZKKVReDFknfNCcabfj+HjlOg9wBCZZ+D3vopxZ4Qgevz/pLqcnLY7Kxx+xf6NhqDwcEwkHk/VYomBLrfyEZP8N81dcv36ZZUVoca5Y+2ZG2o1gC632nLGosyJBtmPmel", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCzhflzK5hY1zLI6wcdpu95QnCG0TwdK/8SyDudYYHDoRycyI9pVuSUQsXpQq3jHdjCurFgtKtyr8lvo1dWvKJ9SZpQk4asejncDNODyhSR95eNQg6E1G2kN1mscOp76cjW9Muvyhcku112WdRWTEojLJL5DfJAWrsWwHJI+QuhZuKvrlsxPvfOuY5td/aGC/Ydzbjkmya2qvXXJRscQArDnYmiPiatkFESRif9MXdmIn2LqQXAcZGFUG+SWQvZR1PDWKI2U5HxvoeUf+Uh2jDO3mFDWao9+SGRC2QuO+xLJgoiKIx2L3GWLTkbKjAbsk0iedaUuh+GdmUatsU09UVZi9IYBJYjhiYuZKsYx2LNpBqu8fxh5RaBfadQzkDGVBJE45/9X+9vlSygk3zMak9yWtS9vfV+CoODJx9wA1tv3r0Veiy/Y9bbcT7DtQhiWscP2X/cF2QZtdabW+Rb+zKZomn+6upN+zZeyVRClRsqVNURxevMs+UyJTKV481ayMU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBHpEZiCiKJZKK5PvXzPGl0kyJcU4P7nxoUjBffLcHt9dAB0RhjGORZ4v3/W6TdO0PAsLaKZ7WyFecLN3V9VWyiA=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIK5vZWfq5/76ny3vCPOJqG/mpsIiiNwZzQWhA7bM1PFT", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_fips": false, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "54", "second": "14", "epoch": "1727204054", "epoch_int": "1727204054", "date": "2024-09-24", "time": "14:54:14", 
"iso8601_micro": "2024-09-24T18:54:14.497526Z", "iso8601": "2024-09-24T18:54:14Z", "iso8601_basic": "20240924T145414497526", "iso8601_basic_short": "20240924T145414", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_is_chroot": false, "ansible_pkg_mgr": "dnf", "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "12:8f:92:e7:c1:ab", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.148", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::108f:92ff:fee7:c1ab", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", 
"tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", 
"tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.148", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:8f:92:e7:c1:ab", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.148"], 
"ansible_all_ipv6_addresses": ["fe80::108f:92ff:fee7:c1ab"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.148", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::108f:92ff:fee7:c1ab"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 12180 1727204054.54358: stdout chunk (state=3): >>># clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys <<< 12180 1727204054.54488: stdout chunk (state=3): >>># cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] 
removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] 
removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing 
ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # 
cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing 
ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing 
ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy 
ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python <<< 12180 1727204054.54503: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy 
ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing gc # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 12180 1727204054.54775: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 12180 1727204054.54809: stdout chunk (state=3): >>># destroy importlib.util # destroy importlib.abc # destroy importlib.machinery <<< 12180 1727204054.54831: stdout chunk (state=3): >>># destroy zipimport <<< 12180 1727204054.54872: stdout chunk (state=3): >>># destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma <<< 12180 1727204054.54894: 
stdout chunk (state=3): >>># destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings <<< 12180 1727204054.54907: stdout chunk (state=3): >>># destroy syslog # destroy uuid <<< 12180 1727204054.54949: stdout chunk (state=3): >>># destroy selinux # destroy distro # destroy logging # destroy argparse <<< 12180 1727204054.55004: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector <<< 12180 1727204054.55033: stdout chunk (state=3): >>># destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy pickle # destroy _compat_pickle # destroy queue # destroy multiprocessing.reduction <<< 12180 1727204054.55089: stdout chunk (state=3): >>># destroy shlex <<< 12180 1727204054.55093: stdout chunk (state=3): >>># destroy datetime # destroy base64 <<< 12180 1727204054.55134: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json <<< 12180 1727204054.55149: stdout chunk (state=3): >>># destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy array # destroy multiprocessing.dummy.connection <<< 12180 1727204054.55230: stdout chunk (state=3): >>># cleanup[3] wiping gc # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping termios <<< 12180 1727204054.55285: stdout chunk (state=3): >>># cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] 
wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 <<< 12180 1727204054.55367: stdout chunk (state=3): >>># cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools <<< 12180 1727204054.55406: stdout chunk (state=3): >>># cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] 
wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os <<< 12180 1727204054.55441: stdout chunk (state=3): >>># cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins <<< 12180 1727204054.55477: stdout chunk (state=3): >>># destroy gc # destroy unicodedata # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal <<< 12180 1727204054.55650: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy _sre # destroy sre_parse <<< 12180 1727204054.55714: stdout chunk (state=3): >>># destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select <<< 12180 1727204054.55732: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib_parse # destroy 
ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser <<< 12180 1727204054.55736: stdout chunk (state=3): >>># destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal <<< 12180 1727204054.55770: stdout chunk (state=3): >>># destroy _frozen_importlib # clear sys.audit hooks <<< 12180 1727204054.56179: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. <<< 12180 1727204054.56183: stdout chunk (state=3): >>><<< 12180 1727204054.56185: stderr chunk (state=3): >>><<< 12180 1727204054.56352: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3852098dc0> # /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f385203d3a0> 
import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3852098b20> # /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3852098ac0> import '_signal' # # /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f385203d490> # /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f385203d940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f385203d670> # /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py # code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851dcf190> # /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py # code object from 
'/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851dcf220> # /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851df2850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851dcf940> import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3852055880> # /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851dc8d90> # /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851df2d90> import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f385203d970> Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851d6ef10> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851d740a0> # /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # # /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851d675b0> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851d6f6a0> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851d6e3d0> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py # code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches 
/usr/lib64/python3.9/collections/__init__.py # code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3851c55e50> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851c55940> import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851c55f40> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851c55d90> # /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851c66100> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851d49dc0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851d426a0> # /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f3851d55700> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851d75eb0> # /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3851c66d00> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851d492e0> # extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3851d55310> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851d7ba60> # /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851c66ee0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851c66e20> # /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches 
/usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851c66d90> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py # code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851c39400> # /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851c394f0> import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851c6ef70> import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851c68ac0> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851c68490> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from 
'/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851b6d250> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851c24550> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851c68f40> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851d7b0d0> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851b7fb80> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3851b7feb0> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851b907c0> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py # code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' 
import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851b90d00> # extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3851b2a430> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851b7ffa0> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3851b3a310> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851b90640> import 'pwd' # # extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3851b3a3d0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851c66a60> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' # extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from 
'/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3851b56730> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3851b56a00> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851b567f0> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3851b568e0> # /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3851b56d30> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3851b60280> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851b56970> import 
'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851b49ac0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851c66640> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851b56b20> # code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f3851a8f700> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available import ansible # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38519ce850> # /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from 
'/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' # extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38519ce160> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38519ce280> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38519cefa0> # /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38519ce4f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38519cedc0> import 'atexit' # # extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38519ce580> # /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py # code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38519ce100> # /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py # code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code 
object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38519a30a0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3851388370> # extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3851388070> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851388cd0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38519b6dc0> import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38519b63a0> # /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38519b6f40> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches 
/usr/lib64/python3.9/tokenize.py # code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851a03f40> import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38519d5d60> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38519d5430> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851981af0> # extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38519d5550> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38519d5580> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py # code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f38513f6fa0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851a15280> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38513f3820> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851a15400> # /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py # code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851a15c40> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38513f37c0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38519ae1c0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f3851a159d0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3851a15550> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851a0e940> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f38513e8910> # extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3851926dc0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38513f2550> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f38513e8eb0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38513f2970> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.six # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/converters.py # /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import 
'_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f385194f7f0> # /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38519548b0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3850f75940> import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f385198c730> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.errors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/errors.py # 
zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py # code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38519d12e0> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/arg_spec.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib 
available # zipimport: zlib available # /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3851946880> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3850df1550> import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py # code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3851957910> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f38519a0970> import 'distro' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f385198a850> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/sys_info.py import ansible.module_utils.basic # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/modules/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/namespace.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/typing.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3850cf6c70> # /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py # code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py # code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3850f57a30> # extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3850f579a0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3850f9eb20> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3850f9e550> import 
'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3850f8a2e0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3850f8a970> # /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py # code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' # extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3850f3b2b0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3850f3ba00> # /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3850f3b940> # /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3850d570d0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f38519423a0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3850f8a670> import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/timeout.py import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/collector.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/facter.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/ohai.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/apparmor.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.caps # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/caps.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/chroot.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/utils.py import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/cmdline.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/distribution.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/datetime.py import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/date_time.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/env.py # zipimport: zlib available # zipimport: zlib available import 
ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/dns.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/fips.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/loadavg.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3850c46eb0> # /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py # code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3850c469d0> import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/local.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/lsb.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/platform.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py # code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3850cb2bb0> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3850c56a60> import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/python.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/selinux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py # 
zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3850cb9040> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3850cb96d0> import ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/user.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/base.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/aix.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available 
import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/sysctl.py import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py # zipimport: zlib available # 
zipimport: zlib available import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/aix.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/darwin.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.dragonfly # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/freebsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/iscsi.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/nvme.py # zipimport: zlib available # zipimport: zlib 
available import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/default_collectors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/ansible_collector.py import ansible.module_utils.facts.compat # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_az6jny94/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/__init__.py # zipimport: zlib available # /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py # code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3850c3beb0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3850c3bb80> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3850be8940> import 'gc' # # /usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/queues.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3850c3b040> # /usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3850bfe730> # 
/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3850a4e280> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3850a4e070> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_iscsi_iqn": "", "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node1", "ansible_hostname": "managed-node1", "ansible_nodename": "managed-node1", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "622812727ab94fd6acd7dd0d437b6e90", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": 
"/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 42862 10.31.9.148 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 42862 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, 
"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2807, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 725, "free": 2807}, "nocache": {"free": 3270, "used": 262}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec28c5e6-50d6-5684-e735-f75357a23b08", "ansible_product_uuid": "ec28c5e6-50d6-5684-e735-f75357a23b08", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, 
"ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 317, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264273145856, "block_size": 4096, "block_total": 65519355, "block_available": 64519811, "block_used": 999544, "inode_total": 131071472, "inode_available": 130998255, "inode_used": 73217, "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae"}], "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_fibre_channel_wwn": [], "ansible_local": {}, "ansible_lsb": {}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5aef1ea-3141-48ae-bf33-0c6b351dd422", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_loadavg": {"1m": 0.43, "5m": 0.38, "15m": 0.18}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAL33r0sK53nK1ELEWEygFfBly+jKL3G1irB+e4OjfP+034giVSb4+qmZbsccYzFRUysDiwQ9AOkXxjXzuDmR+xqyvjg1wiGR1mtnhVEjD5QOMP3FrsA4T0YUj+99RePF5V1syZcivhL83fhGMQW2xqX2DsatToaaogZ2OB3PfjtjAAAAFQDxVD/D0jmbOX2y1lkpNUepQHopQwAAAIEAmIlCnXBlcPjUWk7TsM1FtBhlH1jHHCOTF1EkipNNB3yizhCo4XzHdZ42Etc3A12/rcZ94rFaauqCV6XrShBkQ2YBIcz9u8BOyWI/nScoq9IA/qLrhWWtjBWDyrdKnEa5YZssQtDa+FaZQkzy1TQpvFQxv5c95+TrmPFgDpw+0q0AAACBAKYOTFtEPTGzq9w6YdMspRWk65ZgXou58bQl818PvNnuZKKVReDFknfNCcabfj+HjlOg9wBCZZ+D3vopxZ4Qgevz/pLqcnLY7Kxx+xf6NhqDwcEwkHk/VYomBLrfyEZP8N81dcv36ZZUVoca5Y+2ZG2o1gC632nLGosyJBtmPmel", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCzhflzK5hY1zLI6wcdpu95QnCG0TwdK/8SyDudYYHDoRycyI9pVuSUQsXpQq3jHdjCurFgtKtyr8lvo1dWvKJ9SZpQk4asejncDNODyhSR95eNQg6E1G2kN1mscOp76cjW9Muvyhcku112WdRWTEojLJL5DfJAWrsWwHJI+QuhZuKvrlsxPvfOuY5td/aGC/Ydzbjkmya2qvXXJRscQArDnYmiPiatkFESRif9MXdmIn2LqQXAcZGFUG+SWQvZR1PDWKI2U5HxvoeUf+Uh2jDO3mFDWao9+SGRC2QuO+xLJgoiKIx2L3GWLTkbKjAbsk0iedaUuh+GdmUatsU09UVZi9IYBJYjhiYuZKsYx2LNpBqu8fxh5RaBfadQzkDGVBJE45/9X+9vlSygk3zMak9yWtS9vfV+CoODJx9wA1tv3r0Veiy/Y9bbcT7DtQhiWscP2X/cF2QZtdabW+Rb+zKZomn+6upN+zZeyVRClRsqVNURxevMs+UyJTKV481ayMU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBHpEZiCiKJZKK5PvXzPGl0kyJcU4P7nxoUjBffLcHt9dAB0RhjGORZ4v3/W6TdO0PAsLaKZ7WyFecLN3V9VWyiA=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIK5vZWfq5/76ny3vCPOJqG/mpsIiiNwZzQWhA7bM1PFT", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_fips": false, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "54", "second": "14", "epoch": "1727204054", "epoch_int": "1727204054", "date": "2024-09-24", "time": "14:54:14", "iso8601_micro": "2024-09-24T18:54:14.497526Z", "iso8601": "2024-09-24T18:54:14Z", "iso8601_basic": "20240924T145414497526", "iso8601_basic_short": "20240924T145414", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_is_chroot": false, "ansible_pkg_mgr": "dnf", "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "12:8f:92:e7:c1:ab", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.148", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::108f:92ff:fee7:c1ab", "prefix": 
"64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", 
"rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off 
[fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.148", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:8f:92:e7:c1:ab", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.148"], "ansible_all_ipv6_addresses": ["fe80::108f:92ff:fee7:c1ab"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.148", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::108f:92ff:fee7:c1ab"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore 
sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # 
cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] 
removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing 
ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] 
removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy 
ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing 
ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy 
ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy 
ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing gc # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy pickle # destroy _compat_pickle # destroy queue # destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy multiprocessing.connection # destroy tempfile # destroy 
multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping gc # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # 
destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy gc # destroy unicodedata # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # 
destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
[WARNING]: Module invocation had junk after the JSON data (interpreter cleanup output identical to the # clear / # cleanup[2] / # destroy sequence shown above)
removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # 
cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing 
ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy 
ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy 
ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing gc # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy pickle # destroy _compat_pickle # destroy queue # destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 # destroy 
ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping gc # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # 
cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy gc # destroy unicodedata # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy 
posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks [WARNING]: Platform linux on host managed-node1 is using the discovered Python interpreter at /usr/bin/python3.9, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible- core/2.17/reference_appendices/interpreter_discovery.html for more information. 
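The `[WARNING]` just above is ansible-core's interpreter-discovery notice: `/usr/bin/python3.9` was auto-selected on managed-node1 because `ansible_python_interpreter` is unset for that host. When post-processing verbose runs like this one, that warning can be pulled out of the stream programmatically; a minimal sketch (the regex is an assumption inferred from the single warning shown above, not a documented ansible-core format):

```python
import re

# Matches the interpreter-discovery warning as it appears in this log.
# Pattern inferred from the one example above; other warning variants
# may need adjustments. The path group stops before the trailing comma.
DISCOVERY_RE = re.compile(
    r"\[WARNING\]: Platform (?P<platform>\S+) on host (?P<host>\S+) is using "
    r"the discovered Python interpreter at (?P<path>[^\s,]+)"
)

def find_discovered_interpreters(log_text: str):
    """Return (host, interpreter_path) pairs for every discovery warning."""
    return [(m.group("host"), m.group("path"))
            for m in DISCOVERY_RE.finditer(log_text)]
```

For the warning above, this yields the pair `('managed-node1', '/usr/bin/python3.9')`; pinning `ansible_python_interpreter` per host is the usual way to silence the notice.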
12180 1727204054.58774: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204052.7251725-12276-43698125584022/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12180 1727204054.58785: _low_level_execute_command(): starting 12180 1727204054.58893: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204052.7251725-12276-43698125584022/ > /dev/null 2>&1 && sleep 0' 12180 1727204054.60680: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204054.60685: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204054.60730: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204054.60734: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204054.60869: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204054.60944: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204054.60963: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204054.60968: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204054.61052: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204054.62944: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204054.62948: stdout chunk (state=3): >>><<< 12180 1727204054.62950: stderr chunk (state=3): >>><<< 12180 1727204054.63273: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 
2 debug2: Received exit status from master 0 12180 1727204054.63277: handler run complete 12180 1727204054.63279: variable 'ansible_facts' from source: unknown 12180 1727204054.63282: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204054.63582: variable 'ansible_facts' from source: unknown 12180 1727204054.63781: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204054.64179: attempt loop complete, returning result 12180 1727204054.64187: _execute() done 12180 1727204054.64193: dumping result to json 12180 1727204054.64230: done dumping result, returning 12180 1727204054.64250: done running TaskExecutor() for managed-node1/TASK: Gathering Facts [0affcd87-79f5-ccb1-55ae-0000000000cd] 12180 1727204054.64259: sending task result for task 0affcd87-79f5-ccb1-55ae-0000000000cd 12180 1727204054.65259: done sending task result for task 0affcd87-79f5-ccb1-55ae-0000000000cd ok: [managed-node1] 12180 1727204054.65363: no more pending results, returning what we have 12180 1727204054.65371: results queue empty 12180 1727204054.65372: checking for any_errors_fatal 12180 1727204054.65373: done checking for any_errors_fatal 12180 1727204054.65374: checking for max_fail_percentage 12180 1727204054.65375: done checking for max_fail_percentage 12180 1727204054.65376: checking to see if all hosts have failed and the running result is not ok 12180 1727204054.65377: done checking to see if all hosts have failed 12180 1727204054.65378: getting the remaining hosts for this loop 12180 1727204054.65379: done getting the remaining hosts for this loop 12180 1727204054.65383: getting the next task for host managed-node1 12180 1727204054.65390: done getting next task for host managed-node1 12180 1727204054.65392: ^ task is: TASK: meta (flush_handlers) 12180 1727204054.65394: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, 
fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12180 1727204054.65397: getting variables 12180 1727204054.65398: in VariableManager get_vars() 12180 1727204054.65423: Calling all_inventory to load vars for managed-node1 12180 1727204054.65426: Calling groups_inventory to load vars for managed-node1 12180 1727204054.65430: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204054.65441: Calling all_plugins_play to load vars for managed-node1 12180 1727204054.65444: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204054.65447: Calling groups_plugins_play to load vars for managed-node1 12180 1727204054.65612: WORKER PROCESS EXITING 12180 1727204054.65690: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204054.66117: done with get_vars() 12180 1727204054.66245: done getting variables 12180 1727204054.66314: in VariableManager get_vars() 12180 1727204054.66324: Calling all_inventory to load vars for managed-node1 12180 1727204054.66326: Calling groups_inventory to load vars for managed-node1 12180 1727204054.66329: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204054.66334: Calling all_plugins_play to load vars for managed-node1 12180 1727204054.66336: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204054.66339: Calling groups_plugins_play to load vars for managed-node1 12180 1727204054.66938: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204054.67460: done with get_vars() 12180 1727204054.67479: done queuing things up, now waiting for results queue to drain 12180 1727204054.67481: results queue empty 12180 1727204054.67482: checking for any_errors_fatal 12180 
1727204054.67485: done checking for any_errors_fatal 12180 1727204054.67486: checking for max_fail_percentage 12180 1727204054.67487: done checking for max_fail_percentage 12180 1727204054.67492: checking to see if all hosts have failed and the running result is not ok 12180 1727204054.67493: done checking to see if all hosts have failed 12180 1727204054.67494: getting the remaining hosts for this loop 12180 1727204054.67495: done getting the remaining hosts for this loop 12180 1727204054.67498: getting the next task for host managed-node1 12180 1727204054.67503: done getting next task for host managed-node1 12180 1727204054.67506: ^ task is: TASK: Include the task 'el_repo_setup.yml' 12180 1727204054.67507: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12180 1727204054.67509: getting variables 12180 1727204054.67510: in VariableManager get_vars() 12180 1727204054.67519: Calling all_inventory to load vars for managed-node1 12180 1727204054.67521: Calling groups_inventory to load vars for managed-node1 12180 1727204054.67523: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204054.67528: Calling all_plugins_play to load vars for managed-node1 12180 1727204054.67530: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204054.67755: Calling groups_plugins_play to load vars for managed-node1 12180 1727204054.68195: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204054.68802: done with get_vars() 12180 1727204054.68812: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_deprecated_nm.yml:11 Tuesday 24 September 2024 14:54:14 -0400 (0:00:02.076) 0:00:02.103 ***** 12180 1727204054.69110: entering _queue_task() for managed-node1/include_tasks 12180 1727204054.69112: Creating lock for include_tasks 12180 1727204054.70203: worker is 1 (out of 1 available) 12180 1727204054.70215: exiting _queue_task() for managed-node1/include_tasks 12180 1727204054.70228: done queuing things up, now waiting for results queue to drain 12180 1727204054.70230: waiting for pending results... 
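Every debug entry in this trace has the same shape: worker pid, a fractional epoch timestamp, a colon, then the message (e.g. `12180 1727204054.69110: entering _queue_task() ...`). A minimal sketch for turning such lines into structured records (the field layout is inferred from the entries above, not a stable ansible-core format):

```python
import re
from typing import NamedTuple

class Entry(NamedTuple):
    pid: int
    ts: float   # seconds since the epoch, as printed by the debug callback
    msg: str

# '<pid> <epoch.fraction>: <message>' -- layout inferred from this trace.
ENTRY_RE = re.compile(r"(?P<pid>\d+) (?P<ts>\d+\.\d+): (?P<msg>.*)")

def parse_entry(line: str) -> Entry:
    """Parse one debug line of the form seen throughout this log."""
    m = ENTRY_RE.fullmatch(line.strip())
    if m is None:
        raise ValueError(f"not a debug entry: {line!r}")
    return Entry(int(m.group("pid")), float(m.group("ts")), m.group("msg"))
```

With records in this form, the timestamps make it straightforward to measure gaps between steps such as `_queue_task()` entry and `worker is 1 (out of 1 available)`.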
12180 1727204054.71455: running TaskExecutor() for managed-node1/TASK: Include the task 'el_repo_setup.yml' 12180 1727204054.71626: in run() - task 0affcd87-79f5-ccb1-55ae-000000000006 12180 1727204054.71881: variable 'ansible_search_path' from source: unknown 12180 1727204054.71931: calling self._execute() 12180 1727204054.72042: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204054.72307: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204054.72323: variable 'omit' from source: magic vars 12180 1727204054.72485: _execute() done 12180 1727204054.72741: dumping result to json 12180 1727204054.72750: done dumping result, returning 12180 1727204054.72762: done running TaskExecutor() for managed-node1/TASK: Include the task 'el_repo_setup.yml' [0affcd87-79f5-ccb1-55ae-000000000006] 12180 1727204054.72776: sending task result for task 0affcd87-79f5-ccb1-55ae-000000000006 12180 1727204054.72931: no more pending results, returning what we have 12180 1727204054.72937: in VariableManager get_vars() 12180 1727204054.72985: Calling all_inventory to load vars for managed-node1 12180 1727204054.72988: Calling groups_inventory to load vars for managed-node1 12180 1727204054.72992: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204054.73008: Calling all_plugins_play to load vars for managed-node1 12180 1727204054.73011: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204054.73015: Calling groups_plugins_play to load vars for managed-node1 12180 1727204054.73211: done sending task result for task 0affcd87-79f5-ccb1-55ae-000000000006 12180 1727204054.73214: WORKER PROCESS EXITING 12180 1727204054.73240: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204054.73437: done with get_vars() 12180 1727204054.73444: variable 'ansible_search_path' from source: unknown 12180 1727204054.73460: we have 
included files to process 12180 1727204054.73461: generating all_blocks data 12180 1727204054.73462: done generating all_blocks data 12180 1727204054.73465: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 12180 1727204054.73467: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 12180 1727204054.73470: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 12180 1727204054.75549: in VariableManager get_vars() 12180 1727204054.75695: done with get_vars() 12180 1727204054.75708: done processing included file 12180 1727204054.75711: iterating over new_blocks loaded from include file 12180 1727204054.75712: in VariableManager get_vars() 12180 1727204054.75722: done with get_vars() 12180 1727204054.75724: filtering new block on tags 12180 1727204054.75739: done filtering new block on tags 12180 1727204054.75742: in VariableManager get_vars() 12180 1727204054.75977: done with get_vars() 12180 1727204054.75980: filtering new block on tags 12180 1727204054.75997: done filtering new block on tags 12180 1727204054.75999: in VariableManager get_vars() 12180 1727204054.76172: done with get_vars() 12180 1727204054.76174: filtering new block on tags 12180 1727204054.76187: done filtering new block on tags 12180 1727204054.76189: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed-node1 12180 1727204054.76196: extending task lists for all hosts with included blocks 12180 1727204054.76369: done extending task lists 12180 1727204054.76371: done processing included files 12180 1727204054.76372: results queue empty 12180 1727204054.76373: checking for any_errors_fatal 12180 1727204054.76374: done checking for any_errors_fatal 12180 
1727204054.76375: checking for max_fail_percentage 12180 1727204054.76376: done checking for max_fail_percentage 12180 1727204054.76377: checking to see if all hosts have failed and the running result is not ok 12180 1727204054.76377: done checking to see if all hosts have failed 12180 1727204054.76378: getting the remaining hosts for this loop 12180 1727204054.76379: done getting the remaining hosts for this loop 12180 1727204054.76382: getting the next task for host managed-node1 12180 1727204054.76386: done getting next task for host managed-node1 12180 1727204054.76388: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 12180 1727204054.76390: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12180 1727204054.76392: getting variables 12180 1727204054.76393: in VariableManager get_vars() 12180 1727204054.76401: Calling all_inventory to load vars for managed-node1 12180 1727204054.76403: Calling groups_inventory to load vars for managed-node1 12180 1727204054.76406: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204054.76411: Calling all_plugins_play to load vars for managed-node1 12180 1727204054.76414: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204054.76417: Calling groups_plugins_play to load vars for managed-node1 12180 1727204054.77012: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204054.77397: done with get_vars() 12180 1727204054.77407: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Tuesday 24 September 2024 14:54:14 -0400 (0:00:00.087) 0:00:02.190 ***** 12180 1727204054.77817: entering _queue_task() for managed-node1/setup 12180 1727204054.78851: worker is 1 (out of 1 available) 12180 1727204054.78862: exiting _queue_task() for managed-node1/setup 12180 1727204054.78878: done queuing things up, now waiting for results queue to drain 12180 1727204054.78880: waiting for pending results... 
12180 1727204054.80267: running TaskExecutor() for managed-node1/TASK: Gather the minimum subset of ansible_facts required by the network role test 12180 1727204054.80968: in run() - task 0affcd87-79f5-ccb1-55ae-0000000000de 12180 1727204054.81288: variable 'ansible_search_path' from source: unknown 12180 1727204054.81296: variable 'ansible_search_path' from source: unknown 12180 1727204054.81342: calling self._execute() 12180 1727204054.81432: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204054.81445: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204054.81460: variable 'omit' from source: magic vars 12180 1727204054.83184: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12180 1727204054.90606: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12180 1727204054.90806: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12180 1727204054.90857: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12180 1727204054.91004: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12180 1727204054.91039: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12180 1727204054.91236: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12180 1727204054.91271: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12180 1727204054.91323: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12180 1727204054.91437: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12180 1727204054.91519: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12180 1727204054.91917: variable 'ansible_facts' from source: unknown 12180 1727204054.92224: variable 'network_test_required_facts' from source: task vars 12180 1727204054.92320: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 12180 1727204054.92482: variable 'omit' from source: magic vars 12180 1727204054.92622: variable 'omit' from source: magic vars 12180 1727204054.92662: variable 'omit' from source: magic vars 12180 1727204054.92732: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12180 1727204054.92941: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12180 1727204054.92969: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12180 1727204054.93041: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204054.93057: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204054.93261: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12180 1727204054.93273: variable 'ansible_host' from source: host vars for 
'managed-node1' 12180 1727204054.93282: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204054.93512: Set connection var ansible_pipelining to False 12180 1727204054.93577: Set connection var ansible_shell_type to sh 12180 1727204054.93689: Set connection var ansible_timeout to 10 12180 1727204054.93797: Set connection var ansible_connection to ssh 12180 1727204054.93809: Set connection var ansible_shell_executable to /bin/sh 12180 1727204054.93819: Set connection var ansible_module_compression to ZIP_DEFLATED 12180 1727204054.93854: variable 'ansible_shell_executable' from source: unknown 12180 1727204054.93863: variable 'ansible_connection' from source: unknown 12180 1727204054.93875: variable 'ansible_module_compression' from source: unknown 12180 1727204054.93882: variable 'ansible_shell_type' from source: unknown 12180 1727204054.93893: variable 'ansible_shell_executable' from source: unknown 12180 1727204054.94007: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204054.94029: variable 'ansible_pipelining' from source: unknown 12180 1727204054.94068: variable 'ansible_timeout' from source: unknown 12180 1727204054.94080: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204054.94379: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 12180 1727204054.94395: variable 'omit' from source: magic vars 12180 1727204054.94406: starting attempt loop 12180 1727204054.94450: running the handler 12180 1727204054.94471: _low_level_execute_command(): starting 12180 1727204054.94483: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12180 1727204054.96668: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 
1727204054.96686: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204054.96700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204054.96719: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204054.96775: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204054.96789: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204054.96804: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204054.96825: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204054.96842: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204054.96856: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204054.96878: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204054.96895: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204054.96985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204054.96997: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204054.97006: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204054.97017: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204054.97188: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204054.97325: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204054.97344: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204054.97444: 
stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204054.99158: stdout chunk (state=3): >>>/root <<< 12180 1727204054.99414: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204054.99418: stdout chunk (state=3): >>><<< 12180 1727204054.99420: stderr chunk (state=3): >>><<< 12180 1727204054.99542: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204054.99546: _low_level_execute_command(): starting 12180 1727204054.99549: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204054.9944718-12480-42460675977174 `" && echo ansible-tmp-1727204054.9944718-12480-42460675977174="` echo /root/.ansible/tmp/ansible-tmp-1727204054.9944718-12480-42460675977174 
`" ) && sleep 0' 12180 1727204055.01414: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204055.01551: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204055.01573: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204055.01593: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204055.01648: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204055.01769: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204055.01784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204055.01801: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204055.01811: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204055.01821: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204055.01837: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204055.01852: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204055.01878: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204055.01891: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204055.01903: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204055.01918: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204055.02003: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204055.02102: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 
1727204055.02119: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204055.02213: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204055.04039: stdout chunk (state=3): >>>ansible-tmp-1727204054.9944718-12480-42460675977174=/root/.ansible/tmp/ansible-tmp-1727204054.9944718-12480-42460675977174 <<< 12180 1727204055.04188: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204055.04271: stderr chunk (state=3): >>><<< 12180 1727204055.04274: stdout chunk (state=3): >>><<< 12180 1727204055.04374: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204054.9944718-12480-42460675977174=/root/.ansible/tmp/ansible-tmp-1727204054.9944718-12480-42460675977174 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204055.04378: variable 'ansible_module_compression' 
from source: unknown 12180 1727204055.04484: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12180cbnqllfr/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 12180 1727204055.04488: variable 'ansible_facts' from source: unknown 12180 1727204055.04668: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204054.9944718-12480-42460675977174/AnsiballZ_setup.py 12180 1727204055.05390: Sending initial data 12180 1727204055.05393: Sent initial data (153 bytes) 12180 1727204055.07839: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204055.07974: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204055.07990: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204055.08009: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204055.08055: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204055.08182: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204055.08196: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204055.08214: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204055.08235: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204055.08248: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204055.08261: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204055.08284: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204055.08301: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204055.08315: stderr chunk (state=3): >>>debug2: checking 
match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204055.08336: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204055.08351: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204055.08438: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204055.08491: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204055.08514: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204055.08605: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204055.10309: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12180 1727204055.10366: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12180 1727204055.10417: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12180cbnqllfr/tmpxlbn04vw /root/.ansible/tmp/ansible-tmp-1727204054.9944718-12480-42460675977174/AnsiballZ_setup.py <<< 12180 1727204055.10471: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12180 1727204055.13473: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204055.13638: stderr chunk (state=3): >>><<< 12180 1727204055.13642: stdout 
chunk (state=3): >>><<< 12180 1727204055.13645: done transferring module to remote 12180 1727204055.13647: _low_level_execute_command(): starting 12180 1727204055.13649: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204054.9944718-12480-42460675977174/ /root/.ansible/tmp/ansible-tmp-1727204054.9944718-12480-42460675977174/AnsiballZ_setup.py && sleep 0' 12180 1727204055.15168: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204055.15186: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204055.15202: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204055.15223: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204055.15275: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204055.15479: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204055.15496: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204055.15514: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204055.15529: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204055.15542: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204055.15555: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204055.15572: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204055.15589: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204055.15602: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 
1727204055.15616: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204055.15633: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204055.15708: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204055.15736: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204055.15754: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204055.15848: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204055.17671: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204055.17695: stdout chunk (state=3): >>><<< 12180 1727204055.17699: stderr chunk (state=3): >>><<< 12180 1727204055.17793: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: 
Received exit status from master 0 12180 1727204055.17796: _low_level_execute_command(): starting 12180 1727204055.17799: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204054.9944718-12480-42460675977174/AnsiballZ_setup.py && sleep 0' 12180 1727204055.21360: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204055.21366: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204055.21399: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204055.21403: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204055.21405: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204055.21926: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204055.22007: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204055.22093: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204055.24056: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin <<< 12180 1727204055.24060: stdout chunk (state=3): 
>>>import '_thread' # import '_warnings' # import '_weakref' # <<< 12180 1727204055.24117: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 12180 1727204055.24150: stdout chunk (state=3): >>>import 'posix' # <<< 12180 1727204055.24183: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 12180 1727204055.24232: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook <<< 12180 1727204055.24299: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' <<< 12180 1727204055.24318: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py <<< 12180 1727204055.24353: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' <<< 12180 1727204055.24356: stdout chunk (state=3): >>>import '_codecs' # <<< 12180 1727204055.24369: stdout chunk (state=3): >>>import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672e98dc0> <<< 12180 1727204055.24414: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py <<< 12180 1727204055.24422: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672e3d3a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672e98b20> <<< 12180 1727204055.24455: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' <<< 12180 
1727204055.24476: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672e98ac0> <<< 12180 1727204055.24488: stdout chunk (state=3): >>>import '_signal' # <<< 12180 1727204055.24523: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' <<< 12180 1727204055.24526: stdout chunk (state=3): >>>import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672e3d490> <<< 12180 1727204055.24549: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' <<< 12180 1727204055.24572: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' <<< 12180 1727204055.24593: stdout chunk (state=3): >>>import '_abc' # <<< 12180 1727204055.24603: stdout chunk (state=3): >>>import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672e3d940> <<< 12180 1727204055.24618: stdout chunk (state=3): >>>import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672e3d670> <<< 12180 1727204055.24645: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py <<< 12180 1727204055.24663: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' <<< 12180 1727204055.24677: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py <<< 12180 1727204055.24701: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' <<< 12180 1727204055.24712: stdout 
chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py <<< 12180 1727204055.24741: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' <<< 12180 1727204055.24760: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672bcf190> <<< 12180 1727204055.24780: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py <<< 12180 1727204055.24806: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' <<< 12180 1727204055.24877: stdout chunk (state=3): >>>import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672bcf220> <<< 12180 1727204055.24905: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' <<< 12180 1727204055.24944: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672bf2850> <<< 12180 1727204055.24949: stdout chunk (state=3): >>>import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672bcf940> <<< 12180 1727204055.24980: stdout chunk (state=3): >>>import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672e55880> <<< 12180 1727204055.25005: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' <<< 12180 
1727204055.25011: stdout chunk (state=3): >>>import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672bc8d90> <<< 12180 1727204055.25074: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' <<< 12180 1727204055.25081: stdout chunk (state=3): >>>import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672bf2d90> <<< 12180 1727204055.25133: stdout chunk (state=3): >>>import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672e3d970> <<< 12180 1727204055.25165: stdout chunk (state=3): >>>Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 12180 1727204055.25491: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py <<< 12180 1727204055.25515: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' <<< 12180 1727204055.25536: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' <<< 12180 1727204055.25556: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py <<< 12180 1727204055.25583: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' <<< 12180 1727204055.25597: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py <<< 12180 1727204055.25614: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' <<< 12180 1727204055.25620: stdout 
chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672b93f10> <<< 12180 1727204055.25677: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672b990a0> <<< 12180 1727204055.25684: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py <<< 12180 1727204055.25700: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' <<< 12180 1727204055.25711: stdout chunk (state=3): >>>import '_sre' # <<< 12180 1727204055.25744: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py <<< 12180 1727204055.25749: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' <<< 12180 1727204055.25774: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' <<< 12180 1727204055.25797: stdout chunk (state=3): >>>import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672b8c5b0> <<< 12180 1727204055.25812: stdout chunk (state=3): >>>import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672b946a0> <<< 12180 1727204055.25830: stdout chunk (state=3): >>>import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672b933d0> <<< 12180 1727204055.25842: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py <<< 12180 1727204055.25915: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' <<< 12180 1727204055.25936: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py <<< 12180 1727204055.25968: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' <<< 12180 1727204055.25987: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py <<< 12180 1727204055.25993: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' <<< 12180 1727204055.26028: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' <<< 12180 1727204055.26048: stdout chunk (state=3): >>># extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7672a7aeb0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672a7a9a0> <<< 12180 1727204055.26054: stdout chunk (state=3): >>>import 'itertools' # <<< 12180 1727204055.26074: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672a7afa0> <<< 12180 1727204055.26107: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py <<< 12180 1727204055.26113: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' <<< 12180 1727204055.26141: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672a7adf0> <<< 12180 1727204055.26171: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py <<< 12180 1727204055.26185: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672a8a160> <<< 12180 1727204055.26191: stdout chunk (state=3): >>>import '_collections' # <<< 12180 1727204055.26238: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672b6ee20> <<< 12180 1727204055.26245: stdout chunk (state=3): >>>import '_functools' # <<< 12180 1727204055.26267: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672b66700> <<< 12180 1727204055.26331: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672b7a760> <<< 12180 1727204055.26335: stdout chunk (state=3): >>>import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672b9aeb0> <<< 12180 1727204055.26352: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' <<< 12180 1727204055.26392: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7672a8ad60> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672b6e340> <<< 12180 1727204055.26444: stdout chunk (state=3): >>># extension module 
'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7672b7a370> <<< 12180 1727204055.26451: stdout chunk (state=3): >>>import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672ba0a60> <<< 12180 1727204055.26472: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py <<< 12180 1727204055.26482: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' <<< 12180 1727204055.26499: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' <<< 12180 1727204055.26519: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py <<< 12180 1727204055.26546: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' <<< 12180 1727204055.26552: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672a8af40> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672a8ae80> <<< 12180 1727204055.26582: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672a8adf0> <<< 12180 1727204055.26612: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py <<< 12180 1727204055.26615: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' <<< 12180 1727204055.26641: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' <<< 12180 1727204055.26677: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py <<< 12180 1727204055.26714: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' <<< 12180 1727204055.26748: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672a5e460> <<< 12180 1727204055.26767: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py <<< 12180 1727204055.26780: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' <<< 12180 1727204055.26814: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672a5e550> <<< 12180 1727204055.26935: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672a3c0d0> <<< 12180 1727204055.26977: stdout chunk (state=3): >>>import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672a8db20> <<< 12180 1727204055.26983: stdout chunk (state=3): >>>import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7f7672a8d4c0> <<< 12180 1727204055.27009: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' <<< 12180 1727204055.27056: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py <<< 12180 1727204055.27059: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' <<< 12180 1727204055.27095: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' <<< 12180 1727204055.27101: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76729922b0> <<< 12180 1727204055.27132: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672a49d60> <<< 12180 1727204055.27182: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672a8dfa0> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672ba00d0> <<< 12180 1727204055.27216: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py <<< 12180 1727204055.27234: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' <<< 12180 1727204055.27261: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76729a2be0> <<< 12180 1727204055.27276: stdout chunk (state=3): >>>import 'errno' # <<< 12180 
1727204055.27312: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f76729a2f10> <<< 12180 1727204055.27335: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py <<< 12180 1727204055.27343: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' <<< 12180 1727204055.27363: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' <<< 12180 1727204055.27380: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76729b5820> <<< 12180 1727204055.27394: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py <<< 12180 1727204055.27432: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' <<< 12180 1727204055.27453: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76729b5d60> <<< 12180 1727204055.27498: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f767294e490> <<< 12180 1727204055.27507: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76729a2f40> <<< 12180 1727204055.27531: 
stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' <<< 12180 1727204055.27583: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f767295e370> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76729b56a0> <<< 12180 1727204055.27597: stdout chunk (state=3): >>>import 'pwd' # <<< 12180 1727204055.27624: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f767295e430> <<< 12180 1727204055.27661: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672a8aac0> <<< 12180 1727204055.27685: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py <<< 12180 1727204055.27697: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' <<< 12180 1727204055.27729: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py <<< 12180 1727204055.27733: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' <<< 12180 1727204055.27778: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from 
'/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f767297a790> <<< 12180 1727204055.27798: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' <<< 12180 1727204055.27831: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f767297aa60> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f767297a850> <<< 12180 1727204055.27888: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f767297a940> # /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' <<< 12180 1727204055.28111: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f767297ad90> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' <<< 12180 1727204055.28129: stdout chunk (state=3): >>># extension module '_blake2' 
executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f76729842e0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f767297a9d0> <<< 12180 1727204055.28151: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f767296eb20> <<< 12180 1727204055.28170: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672a8a6a0> <<< 12180 1727204055.28212: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py <<< 12180 1727204055.28265: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' <<< 12180 1727204055.28283: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f767297ab80> <<< 12180 1727204055.28433: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/cp437.pyc' <<< 12180 1727204055.28444: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f767289d760> <<< 12180 1727204055.28749: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip' # zipimport: zlib available <<< 12180 1727204055.28846: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.28886: stdout chunk (state=3): >>>import ansible # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/__init__.py # zipimport: zlib available <<< 12180 1727204055.28921: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.28940: stdout chunk (state=3): >>>import ansible.module_utils # loaded from Zip 
/tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available <<< 12180 1727204055.30145: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.31067: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76727da8b0> <<< 12180 1727204055.31075: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' <<< 12180 1727204055.31102: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' <<< 12180 1727204055.31132: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' <<< 12180 1727204055.31149: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f76727da160> <<< 12180 1727204055.31192: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76727da280> <<< 12180 1727204055.31221: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76727da5e0> <<< 12180 1727204055.31255: stdout chunk (state=3): 
>>># /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py <<< 12180 1727204055.31258: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' <<< 12180 1727204055.31299: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76727da4f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76727dae20> <<< 12180 1727204055.31303: stdout chunk (state=3): >>>import 'atexit' # <<< 12180 1727204055.31332: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f76727da580> <<< 12180 1727204055.31346: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py <<< 12180 1727204055.31375: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' <<< 12180 1727204055.31412: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76727da100> <<< 12180 1727204055.31428: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py <<< 12180 1727204055.31449: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' <<< 12180 1727204055.31460: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py <<< 12180 1727204055.31505: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' <<< 12180 1727204055.31508: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py <<< 12180 1727204055.31510: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' <<< 12180 1727204055.31594: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f767276f040> <<< 12180 1727204055.31630: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f76721893d0> <<< 12180 1727204055.31669: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' <<< 12180 1727204055.31675: stdout chunk (state=3): >>># extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f76721890d0> <<< 12180 1727204055.31696: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' <<< 12180 1727204055.31726: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672189d30> <<< 12180 1727204055.31743: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76727c2d90> <<< 12180 1727204055.31907: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76727c23a0> <<< 12180 1727204055.31935: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' <<< 12180 1727204055.31956: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76727c2f40> <<< 12180 1727204055.31981: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py <<< 12180 1727204055.31984: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' <<< 12180 1727204055.32020: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' <<< 12180 1727204055.32054: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py <<< 12180 1727204055.32060: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' <<< 12180 1727204055.32086: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py <<< 12180 1727204055.32089: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f767289da90> <<< 12180 1727204055.32168: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672798dc0> <<< 12180 1727204055.32171: stdout chunk (state=3): >>>import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672798490> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76727d7a90> <<< 12180 1727204055.32200: stdout chunk (state=3): >>># extension module 'syslog' loaded from 
'/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f76727985b0> <<< 12180 1727204055.32224: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' <<< 12180 1727204055.32231: stdout chunk (state=3): >>>import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76727985e0> <<< 12180 1727204055.32252: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py <<< 12180 1727204055.32281: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' <<< 12180 1727204055.32284: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py <<< 12180 1727204055.32315: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' <<< 12180 1727204055.32389: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' <<< 12180 1727204055.32395: stdout chunk (state=3): >>># extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f76721f4f70> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76728232e0> <<< 12180 1727204055.32416: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py <<< 12180 1727204055.32419: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' <<< 12180 1727204055.32476: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f76721f17f0> <<< 12180 1727204055.32491: stdout chunk (state=3): >>>import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672823460> <<< 12180 1727204055.32513: stdout chunk (state=3): >>># /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py <<< 12180 1727204055.32541: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' <<< 12180 1727204055.32561: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py <<< 12180 1727204055.32578: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # <<< 12180 1727204055.32638: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672823c40> <<< 12180 1727204055.32758: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76721f1790> <<< 12180 1727204055.32849: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f7672823130> <<< 12180 1727204055.32881: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' <<< 12180 1727204055.32893: stdout chunk (state=3): >>># extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7672823670> <<< 12180 1727204055.32930: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' <<< 12180 1727204055.32947: stdout chunk (state=3): >>># extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7672823730> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f767281c9a0> <<< 12180 1727204055.32965: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' <<< 12180 1727204055.32981: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py <<< 12180 1727204055.32996: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' <<< 12180 1727204055.33042: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' 
# <_frozen_importlib_external.ExtensionFileLoader object at 0x7f76721e78e0> <<< 12180 1727204055.33208: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' <<< 12180 1727204055.33222: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7672733c70> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76721f0520> <<< 12180 1727204055.33269: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f76721e7e80> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76721f0940> <<< 12180 1727204055.33287: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 12180 1727204055.33302: stdout chunk (state=3): >>>import ansible.module_utils.compat # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available <<< 12180 1727204055.33382: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.33456: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.33488: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available <<< 12180 1727204055.33504: stdout chunk (state=3): >>># zipimport: zlib available import 
ansible.module_utils.common.text # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/common/text/__init__.py <<< 12180 1727204055.33517: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.33627: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.33804: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.34149: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.34612: stdout chunk (state=3): >>>import ansible.module_utils.six # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/six/__init__.py <<< 12180 1727204055.34616: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/common/text/converters.py <<< 12180 1727204055.34653: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py <<< 12180 1727204055.34657: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' <<< 12180 1727204055.34706: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f767272e790> <<< 12180 1727204055.34781: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py <<< 12180 1727204055.34786: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f767276d850> <<< 12180 1727204055.34806: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7671da5fa0> <<< 12180 1727204055.34851: stdout chunk (state=3): >>>import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/compat/selinux.py <<< 12180 1727204055.34855: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.34875: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.34888: stdout chunk (state=3): >>>import ansible.module_utils._text # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available <<< 12180 1727204055.35021: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.35153: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py <<< 12180 1727204055.35157: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' <<< 12180 1727204055.35182: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76727a0310> <<< 12180 1727204055.35185: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.35568: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.35927: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.35982: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.36055: stdout chunk (state=3): >>>import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/common/collections.py <<< 12180 
1727204055.36059: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.36089: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.36133: stdout chunk (state=3): >>>import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/common/warnings.py <<< 12180 1727204055.36136: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.36185: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.36287: stdout chunk (state=3): >>>import ansible.module_utils.errors # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/errors.py <<< 12180 1727204055.36295: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.36297: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.36300: stdout chunk (state=3): >>>import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/parsing/__init__.py <<< 12180 1727204055.36311: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.36334: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.36377: stdout chunk (state=3): >>>import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/parsing/convert_bool.py <<< 12180 1727204055.36380: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.36570: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.36752: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py <<< 12180 1727204055.36793: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' <<< 12180 1727204055.36796: stdout chunk 
(state=3): >>>import '_ast' # <<< 12180 1727204055.36862: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76727e0ca0> <<< 12180 1727204055.36867: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.36927: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.37019: stdout chunk (state=3): >>>import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/common/validation.py <<< 12180 1727204055.37023: stdout chunk (state=3): >>>import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/common/parameters.py <<< 12180 1727204055.37025: stdout chunk (state=3): >>>import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/common/arg_spec.py <<< 12180 1727204055.37036: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.37067: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.37106: stdout chunk (state=3): >>>import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/common/locale.py <<< 12180 1727204055.37109: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.37143: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.37184: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.37272: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.37338: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py <<< 12180 1727204055.37361: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' <<< 12180 1727204055.37430: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7672751c70> <<< 12180 1727204055.37515: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76727e0bb0> <<< 12180 1727204055.37551: stdout chunk (state=3): >>>import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/common/file.py <<< 12180 1727204055.37555: stdout chunk (state=3): >>>import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available <<< 12180 1727204055.37614: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.37665: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.37693: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.37746: stdout chunk (state=3): >>># /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py <<< 12180 1727204055.37750: stdout chunk (state=3): >>># code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' <<< 12180 1727204055.37762: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py <<< 12180 1727204055.37804: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' <<< 12180 1727204055.37815: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py <<< 12180 1727204055.37847: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' <<< 12180 1727204055.37919: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76727302b0> <<< 12180 1727204055.37959: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76727aeb80> <<< 12180 1727204055.38032: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7671c17eb0> <<< 12180 1727204055.38036: stdout chunk (state=3): >>># destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available <<< 12180 1727204055.38053: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.38079: stdout chunk (state=3): >>>import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/common/_utils.py <<< 12180 1727204055.38082: stdout chunk (state=3): >>>import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/common/sys_info.py <<< 12180 1727204055.38183: stdout chunk (state=3): >>>import ansible.module_utils.basic # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/basic.py <<< 12180 1727204055.38186: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 12180 1727204055.38188: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.38190: stdout chunk (state=3): >>>import ansible.modules # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/modules/__init__.py <<< 12180 1727204055.38201: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.38265: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.38307: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.38323: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.38340: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.38382: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.38418: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.38450: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.38490: stdout chunk (state=3): >>>import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/namespace.py <<< 12180 1727204055.38494: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.38552: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.38620: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.38635: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.38678: stdout chunk (state=3): >>>import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/compat/typing.py <<< 12180 1727204055.38681: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.38825: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.38960: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 
1727204055.38996: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.39052: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py <<< 12180 1727204055.39055: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' <<< 12180 1727204055.39073: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py <<< 12180 1727204055.39105: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' <<< 12180 1727204055.39107: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py <<< 12180 1727204055.39109: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' <<< 12180 1727204055.39141: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7671b02100> <<< 12180 1727204055.39169: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py <<< 12180 1727204055.39173: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' <<< 12180 1727204055.39186: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py <<< 12180 1727204055.39211: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' <<< 12180 1727204055.39252: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches 
/usr/lib64/python3.9/_compat_pickle.py # code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' <<< 12180 1727204055.39267: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7671d66a60> <<< 12180 1727204055.39302: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' <<< 12180 1727204055.39305: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7671d669d0> <<< 12180 1727204055.39385: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7671d39c70> <<< 12180 1727204055.39388: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7671d39c10> <<< 12180 1727204055.39416: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7671d85460> <<< 12180 1727204055.39419: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7671d853d0> <<< 12180 1727204055.39433: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py <<< 12180 1727204055.39453: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' <<< 12180 1727204055.39477: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py <<< 12180 1727204055.39483: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' <<< 12180 1727204055.39518: stdout chunk (state=3): >>># 
extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7671d49310> <<< 12180 1727204055.39521: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7671d499a0> <<< 12180 1727204055.39552: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' <<< 12180 1727204055.39593: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7671d49940> <<< 12180 1727204055.39596: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py <<< 12180 1727204055.39623: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' <<< 12180 1727204055.39649: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' <<< 12180 1727204055.39666: stdout chunk (state=3): >>># extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7671b640d0> <<< 12180 1727204055.39679: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f767282cc40> <<< 12180 1727204055.39710: stdout chunk (state=3): >>>import 'multiprocessing.pool' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f7671d85790> import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/timeout.py <<< 12180 1727204055.39752: stdout chunk (state=3): >>>import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/collector.py <<< 12180 1727204055.39756: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/other/__init__.py <<< 12180 1727204055.39770: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.39817: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.39870: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/other/facter.py <<< 12180 1727204055.39885: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.39918: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.39963: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/other/ohai.py <<< 12180 1727204055.39984: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 12180 1727204055.39999: stdout chunk (state=3): >>>import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/system/__init__.py <<< 12180 1727204055.40010: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.40036: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 12180 1727204055.40070: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/system/apparmor.py <<< 12180 1727204055.40081: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.40108: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.40162: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.caps # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/system/caps.py <<< 12180 1727204055.40182: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.40201: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.40242: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/system/chroot.py <<< 12180 1727204055.40254: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.40299: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.40351: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.40394: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.40453: stdout chunk (state=3): >>>import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/utils.py import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/system/cmdline.py <<< 12180 1727204055.40469: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.40845: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.41209: stdout chunk (state=3): >>>import 
ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/system/distribution.py <<< 12180 1727204055.41213: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.41266: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.41304: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.41340: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.41388: stdout chunk (state=3): >>>import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/compat/datetime.py <<< 12180 1727204055.41392: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/system/date_time.py <<< 12180 1727204055.41396: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.41411: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.41433: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/system/env.py <<< 12180 1727204055.41446: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.41495: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.41540: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/system/dns.py # zipimport: zlib available <<< 12180 1727204055.41580: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.41609: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.fips # loaded from Zip 
/tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/system/fips.py <<< 12180 1727204055.41633: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.41636: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.41669: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/system/loadavg.py <<< 12180 1727204055.41678: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.41738: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.41809: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' <<< 12180 1727204055.41842: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7671a54f10> <<< 12180 1727204055.41854: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py <<< 12180 1727204055.41885: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' <<< 12180 1727204055.42047: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7671a549d0> <<< 12180 1727204055.42051: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/system/local.py <<< 12180 1727204055.42053: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.42103: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.42166: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.lsb # loaded from Zip 
/tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/system/lsb.py <<< 12180 1727204055.42169: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.42241: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.42324: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py <<< 12180 1727204055.42330: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.42386: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.42459: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/system/platform.py <<< 12180 1727204055.42463: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.42495: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.42542: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py <<< 12180 1727204055.42554: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' <<< 12180 1727204055.42699: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7671a7dc10> <<< 12180 1727204055.42942: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7671ac6c40> <<< 12180 1727204055.42947: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.python # loaded from Zip 
/tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/system/python.py <<< 12180 1727204055.42949: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.42990: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.43045: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/system/selinux.py <<< 12180 1727204055.43048: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.43113: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.43190: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.43282: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.43428: stdout chunk (state=3): >>>import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py <<< 12180 1727204055.43432: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.43460: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.43505: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py <<< 12180 1727204055.43509: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.43544: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.43592: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py <<< 12180 1727204055.43595: stdout chunk 
(state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' <<< 12180 1727204055.43656: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' <<< 12180 1727204055.43661: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7671ac85e0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7671ac8790> import ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/system/user.py <<< 12180 1727204055.43692: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py <<< 12180 1727204055.43695: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.43725: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.43779: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/base.py <<< 12180 1727204055.43784: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.43900: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.44034: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/aix.py <<< 12180 1727204055.44037: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.44120: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 12180 1727204055.44198: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.44240: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.44284: stdout chunk (state=3): >>>import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/sysctl.py <<< 12180 1727204055.44288: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py <<< 12180 1727204055.44290: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.44357: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.44374: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.44501: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.44896: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py # zipimport: zlib available # zipimport: zlib available <<< 12180 1727204055.44899: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.45329: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.45747: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.linux # loaded from Zip 
/tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py <<< 12180 1727204055.45769: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.45846: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.45942: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py <<< 12180 1727204055.45945: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.46029: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.46118: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py <<< 12180 1727204055.46121: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.46247: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.46408: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py <<< 12180 1727204055.46412: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.46418: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.46420: stdout chunk (state=3): >>>import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/network/__init__.py <<< 12180 1727204055.46442: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.46453: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 12180 1727204055.46501: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/network/base.py <<< 12180 1727204055.46504: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.46589: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.46672: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.46846: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.47021: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py <<< 12180 1727204055.47024: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.aix # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/network/aix.py <<< 12180 1727204055.47029: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.47060: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.47101: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/network/darwin.py # zipimport: zlib available <<< 12180 1727204055.47117: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.47152: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py <<< 12180 1727204055.47155: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.47215: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.47286: stdout chunk 
(state=3): >>>import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py <<< 12180 1727204055.47290: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.47311: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.47344: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/network/freebsd.py # zipimport: zlib available <<< 12180 1727204055.47390: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.47446: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/network/hpux.py <<< 12180 1727204055.47449: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.47498: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.47554: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/network/hurd.py <<< 12180 1727204055.47557: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.47768: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.47990: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/network/linux.py <<< 12180 1727204055.47994: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.48042: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.48098: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.iscsi # loaded from Zip 
/tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/network/iscsi.py <<< 12180 1727204055.48102: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.48132: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.48174: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/network/nvme.py <<< 12180 1727204055.48183: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.48200: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.48238: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/network/netbsd.py <<< 12180 1727204055.48241: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.48265: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.48306: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/network/openbsd.py <<< 12180 1727204055.48310: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.48376: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.48476: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/network/sunos.py <<< 12180 1727204055.48480: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.48482: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.48484: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual # loaded from Zip 
/tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py <<< 12180 1727204055.48500: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.48531: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.48578: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/base.py <<< 12180 1727204055.48582: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.48597: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.48617: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.48654: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.48697: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.48756: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.48831: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py <<< 12180 1727204055.48835: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py <<< 12180 1727204055.48852: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.48885: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.48936: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.hpux # loaded from Zip 
/tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py <<< 12180 1727204055.48940: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.49099: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.49265: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/linux.py <<< 12180 1727204055.49269: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.49305: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.49355: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py <<< 12180 1727204055.49358: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.49402: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.49441: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py # zipimport: zlib available <<< 12180 1727204055.49514: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.49589: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py <<< 12180 1727204055.49593: stdout chunk (state=3): >>>import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/default_collectors.py # zipimport: zlib available <<< 12180 1727204055.49671: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 
1727204055.49753: stdout chunk (state=3): >>>import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/ansible_collector.py <<< 12180 1727204055.49758: stdout chunk (state=3): >>>import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/__init__.py <<< 12180 1727204055.49812: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204055.50061: stdout chunk (state=3): >>>import 'gc' # <<< 12180 1727204055.50891: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py <<< 12180 1727204055.50895: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' <<< 12180 1727204055.50917: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py <<< 12180 1727204055.50974: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' <<< 12180 1727204055.50978: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' <<< 12180 1727204055.50981: stdout chunk (state=3): >>># extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7671868790> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f767183ad00> <<< 12180 1727204055.51035: stdout chunk (state=3): >>>import 'encodings.idna' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f767183aa60> <<< 12180 1727204055.51449: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAL33r0sK53nK1ELEWEygFfBly+jKL3G1irB+e4OjfP+034giVSb4+qmZbsccYzFRUysDiwQ9AOkXxjXzuDmR+xqyvjg1wiGR1mtnhVEjD5QOMP3FrsA4T0YUj+99RePF5V1syZcivhL83fhGMQW2xqX2DsatToaaogZ2OB3PfjtjAAAAFQDxVD/D0jmbOX2y1lkpNUepQHopQwAAAIEAmIlCnXBlcPjUWk7TsM1FtBhlH1jHHCOTF1EkipNNB3yizhCo4XzHdZ42Etc3A12/rcZ94rFaauqCV6XrShBkQ2YBIcz9u8BOyWI/nScoq9IA/qLrhWWtjBWDyrdKnEa5YZssQtDa+FaZQkzy1TQpvFQxv5c95+TrmPFgDpw+0q0AAACBAKYOTFtEPTGzq9w6YdMspRWk65ZgXou58bQl818PvNnuZKKVReDFknfNCcabfj+HjlOg9wBCZZ+D3vopxZ4Qgevz/pLqcnLY7Kxx+xf6NhqDwcEwkHk/VYomBLrfyEZP8N81dcv36ZZUVoca5Y+2ZG2o1gC632nLGosyJBtmPmel", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCzhflzK5hY1zLI6wcdpu95QnCG0TwdK/8SyDudYYHDoRycyI9pVuSUQsXpQq3jHdjCurFgtKtyr8lvo1dWvKJ9SZpQk4asejncDNODyhSR95eNQg6E1G2kN1mscOp76cjW9Muvyhcku112WdRWTEojLJL5DfJAWrsWwHJI+QuhZuKvrlsxPvfOuY5td/aGC/Ydzbjkmya2qvXXJRscQArDnYmiPiatkFESRif9MXdmIn2LqQXAcZGFUG+SWQvZR1PDWKI2U5HxvoeUf+Uh2jDO3mFDWao9+SGRC2QuO+xLJgoiKIx2L3GWLTkbKjAbsk0iedaUuh+GdmUatsU09UVZi9IYBJYjhiYuZKsYx2LNpBqu8fxh5RaBfadQzkDGVBJE45/9X+9vlSygk3zMak9yWtS9vfV+CoODJx9wA1tv3r0Veiy/Y9bbcT7DtQhiWscP2X/cF2QZtdabW+Rb+zKZomn+6upN+zZeyVRClRsqVNURxevMs+UyJTKV481ayMU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBHpEZiCiKJZKK5PvXzPGl0kyJcU4P7nxoUjBffLcHt9dAB0RhjGORZ4v3/W6TdO0PAsLaKZ7WyFecLN3V9VWyiA=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIK5vZWfq5/76ny3vCPOJqG/mpsIiiNwZzQWhA7bM1PFT", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": 
"UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_<<< 12180 1727204055.51480: stdout chunk (state=3): >>>IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_fips": false, "ansible_lsb": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_local": {}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_apparmor": {"status": "disabled"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "54", "second": "15", "epoch": "1727204055", "epoch_int": "1727204055", "date": "2024-09-24", "time": "14:54:15", "iso8601_micro": "2024-09-24T18:54:15.503740Z", "iso8601": "2024-09-24T18:54:15Z", "iso8601_basic": "20240924T145415503740", "iso8601_basic_short": "20240924T145415", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": 
true, "ansible_os_family": "RedHat", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 42862 10.31.9.148 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 42862 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node1", "ansible_hostname": "managed-node1", "ansible_nodename": "managed-node1", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "622812727ab94fd6acd7dd0d437b6e90", "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 
10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 12180 1727204055.51938: stdout chunk (state=3): >>># clear builtins._ # clear sys.path <<< 12180 1727204055.51947: stdout chunk (state=3): >>># clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache <<< 12180 1727204055.51955: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp <<< 12180 1727204055.51960: stdout chunk (state=3): >>># cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 <<< 12180 1727204055.51967: stdout chunk (state=3): >>># cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants <<< 12180 1727204055.52073: stdout chunk (state=3): >>># destroy sre_constants # 
cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] 
removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader <<< 12180 1727204055.52080: stdout chunk (state=3): >>># cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common <<< 12180 1727204055.52086: stdout chunk (state=3): >>># destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy 
ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text <<< 12180 1727204055.52092: stdout chunk (state=3): >>># cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters <<< 12180 1727204055.52098: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] 
removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context <<< 12180 1727204055.52110: stdout chunk (state=3): >>># cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai <<< 12180 1727204055.52161: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing 
ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing 
ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # 
destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy 
ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin <<< 12180 1727204055.52168: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux <<< 12180 1727204055.52174: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing gc # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna <<< 12180 1727204055.52431: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 12180 1727204055.52467: stdout chunk (state=3): >>># destroy importlib.util # destroy importlib.abc # destroy importlib.machinery <<< 12180 1727204055.52487: stdout chunk (state=3): >>># destroy zipimport # destroy _compression <<< 12180 1727204055.52530: stdout chunk (state=3): >>># destroy binascii # 
destroy importlib # destroy bz2 # destroy lzma <<< 12180 1727204055.52534: stdout chunk (state=3): >>># destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib <<< 12180 1727204055.52536: stdout chunk (state=3): >>># destroy json.decoder # destroy json.encoder # destroy json.scanner <<< 12180 1727204055.52553: stdout chunk (state=3): >>># destroy _json # destroy encodings <<< 12180 1727204055.52576: stdout chunk (state=3): >>># destroy syslog # destroy uuid <<< 12180 1727204055.52604: stdout chunk (state=3): >>># destroy selinux # destroy distro # destroy logging # destroy argparse <<< 12180 1727204055.52660: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector <<< 12180 1727204055.52667: stdout chunk (state=3): >>># destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy pickle # destroy multiprocessing.context # destroy array <<< 12180 1727204055.52670: stdout chunk (state=3): >>># destroy _compat_pickle <<< 12180 1727204055.52728: stdout chunk (state=3): >>># destroy queue # destroy multiprocessing.process # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction <<< 12180 1727204055.52733: stdout chunk (state=3): >>># destroy shlex <<< 12180 1727204055.52736: stdout chunk (state=3): >>># destroy datetime <<< 12180 1727204055.52739: stdout chunk (state=3): >>># destroy base64 <<< 12180 1727204055.52768: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct <<< 12180 1727204055.52784: stdout chunk (state=3): >>># destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout <<< 12180 1727204055.52787: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.collector <<< 12180 1727204055.52843: stdout chunk (state=3): >>># 
cleanup[3] wiping encodings.idna # destroy stringprep <<< 12180 1727204055.52847: stdout chunk (state=3): >>># cleanup[3] wiping unicodedata # cleanup[3] wiping gc # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc <<< 12180 1727204055.52872: stdout chunk (state=3): >>># cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform <<< 12180 1727204055.52888: stdout chunk (state=3): >>># destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl <<< 12180 1727204055.52912: stdout chunk (state=3): >>># cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random <<< 12180 1727204055.52935: stdout chunk (state=3): >>># cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap <<< 12180 1727204055.52972: stdout chunk 
(state=3): >>># cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator <<< 12180 1727204055.52996: stdout chunk (state=3): >>># cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath <<< 12180 1727204055.53016: stdout chunk (state=3): >>># cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc <<< 12180 1727204055.53040: stdout chunk (state=3): >>># cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings <<< 12180 1727204055.53075: stdout chunk (state=3): >>># cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins <<< 12180 1727204055.53097: stdout chunk (state=3): >>># destroy unicodedata # destroy gc # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma <<< 12180 
1727204055.53110: stdout chunk (state=3): >>># destroy zlib # destroy _signal <<< 12180 1727204055.53253: stdout chunk (state=3): >>># destroy platform <<< 12180 1727204055.53292: stdout chunk (state=3): >>># destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize <<< 12180 1727204055.53310: stdout chunk (state=3): >>># destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors <<< 12180 1727204055.53397: stdout chunk (state=3): >>># destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks <<< 12180 1727204055.53744: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
<<< 12180 1727204055.53754: stdout chunk (state=3): >>><<< 12180 1727204055.53770: stderr chunk (state=3): >>><<< 12180 1727204055.53922: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672e98dc0> # /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672e3d3a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672e98b20> # /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672e98ac0> import '_signal' # # /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672e3d490> # 
/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672e3d940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672e3d670> # /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py # code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672bcf190> # /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py # code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672bcf220> # /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672bf2850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672bcf940> import 'os' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f7672e55880> # /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672bc8d90> # /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672bf2d90> import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672e3d970> Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672b93f10> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672b990a0> # /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # # 
/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672b8c5b0> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672b946a0> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672b933d0> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py # code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py # code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7672a7aeb0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672a7a9a0> import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672a7afa0> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches 
/usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672a7adf0> # /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672a8a160> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672b6ee20> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672b66700> # /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672b7a760> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672b9aeb0> # /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7672a8ad60> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672b6e340> # extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7672b7a370> import 
'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672ba0a60> # /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672a8af40> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672a8ae80> # /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672a8adf0> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py # code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672a5e460> # 
/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672a5e550> import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672a3c0d0> import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672a8db20> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672a8d4c0> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76729922b0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672a49d60> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672a8dfa0> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672ba00d0> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76729a2be0> import 'errno' # # extension module 'zlib' loaded from 
'/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f76729a2f10> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76729b5820> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py # code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76729b5d60> # extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f767294e490> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76729a2f40> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f767295e370> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76729b56a0> import 'pwd' # # extension module 'grp' 
loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f767295e430> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672a8aac0> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' # extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f767297a790> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f767297aa60> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f767297a850> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f767297a940> # /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc 
matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f767297ad90> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f76729842e0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f767297a9d0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f767296eb20> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672a8a6a0> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f767297ab80> # code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f767289d760> # zipimport: found 103 names in '/tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available import ansible # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available # zipimport: zlib available # 
/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76727da8b0> # /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' # extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f76727da160> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76727da280> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76727da5e0> # /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76727da4f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76727dae20> import 'atexit' # # extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f76727da580> # /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py # code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76727da100> # /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py # code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f767276f040> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f76721893d0> # extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f76721890d0> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672189d30> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f76727c2d90> import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76727c23a0> # /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76727c2f40> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f767289da90> import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672798dc0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672798490> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76727d7a90> # extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f76727985b0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' 
import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76727985e0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py # code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f76721f4f70> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76728232e0> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f76721f17f0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672823460> # /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py # code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7672823c40> import 
'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76721f1790> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7672823130> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7672823670> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7672823730> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f767281c9a0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object 
at 0x7f76721e78e0> # extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7672733c70> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76721f0520> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f76721e7e80> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76721f0940> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.six # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters 
# loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/common/text/converters.py # /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f767272e790> # /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f767276d850> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7671da5fa0> import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76727a0310> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/common/collections.py # zipimport: 
zlib available # zipimport: zlib available import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.errors # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py # code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76727e0ca0> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/common/arg_spec.py # zipimport: zlib available # zipimport: zlib available import 
ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7672751c70> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76727e0bb0> import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py # code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' import 'gettext' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f76727302b0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f76727aeb80> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7671c17eb0> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/common/sys_info.py import ansible.module_utils.basic # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/modules/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/namespace.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/compat/typing.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7671b02100> # /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py # code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py # code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7671d66a60> # extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7671d669d0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7671d39c70> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7671d39c10> import 
'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7671d85460> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7671d853d0> # /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py # code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' # extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7671d49310> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7671d499a0> # /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7671d49940> # /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7671b640d0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f767282cc40> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7671d85790> import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/timeout.py import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/collector.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/other/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/other/facter.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/other/ohai.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/system/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/system/apparmor.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.caps # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/system/caps.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.chroot # loaded from Zip 
/tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/system/chroot.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/utils.py import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/system/cmdline.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/system/distribution.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/compat/datetime.py import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/system/date_time.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/system/env.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/system/dns.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/system/fips.py # zipimport: zlib available # zipimport: zlib available import 
ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/system/loadavg.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7671a54f10> # /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py # code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7671a549d0> import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/system/local.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/system/lsb.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/system/platform.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py # code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from 
'/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7671a7dc10> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7671ac6c40> import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/system/python.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/system/selinux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7671ac85e0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7671ac8790> import 
ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/system/user.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/base.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/aix.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/sysctl.py import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py # 
zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/network/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/network/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # loaded from Zip 
/tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/network/aix.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/network/darwin.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/network/freebsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/network/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/network/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/network/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/network/iscsi.py # zipimport: zlib available # zipimport: zlib available 
import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/network/nvme.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/network/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/network/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/network/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip 
/tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/default_collectors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/ansible_collector.py import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_setup_payload_1g18t09f/ansible_setup_payload.zip/ansible/module_utils/facts/__init__.py # zipimport: 
zlib available import 'gc' # # /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py # code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7671868790> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f767183ad00> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f767183aa60> {"ansible_facts": {"ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAL33r0sK53nK1ELEWEygFfBly+jKL3G1irB+e4OjfP+034giVSb4+qmZbsccYzFRUysDiwQ9AOkXxjXzuDmR+xqyvjg1wiGR1mtnhVEjD5QOMP3FrsA4T0YUj+99RePF5V1syZcivhL83fhGMQW2xqX2DsatToaaogZ2OB3PfjtjAAAAFQDxVD/D0jmbOX2y1lkpNUepQHopQwAAAIEAmIlCnXBlcPjUWk7TsM1FtBhlH1jHHCOTF1EkipNNB3yizhCo4XzHdZ42Etc3A12/rcZ94rFaauqCV6XrShBkQ2YBIcz9u8BOyWI/nScoq9IA/qLrhWWtjBWDyrdKnEa5YZssQtDa+FaZQkzy1TQpvFQxv5c95+TrmPFgDpw+0q0AAACBAKYOTFtEPTGzq9w6YdMspRWk65ZgXou58bQl818PvNnuZKKVReDFknfNCcabfj+HjlOg9wBCZZ+D3vopxZ4Qgevz/pLqcnLY7Kxx+xf6NhqDwcEwkHk/VYomBLrfyEZP8N81dcv36ZZUVoca5Y+2ZG2o1gC632nLGosyJBtmPmel", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCzhflzK5hY1zLI6wcdpu95QnCG0TwdK/8SyDudYYHDoRycyI9pVuSUQsXpQq3jHdjCurFgtKtyr8lvo1dWvKJ9SZpQk4asejncDNODyhSR95eNQg6E1G2kN1mscOp76cjW9Muvyhcku112WdRWTEojLJL5DfJAWrsWwHJI+QuhZuKvrlsxPvfOuY5td/aGC/Ydzbjkmya2qvXXJRscQArDnYmiPiatkFESRif9MXdmIn2LqQXAcZGFUG+SWQvZR1PDWKI2U5HxvoeUf+Uh2jDO3mFDWao9+SGRC2QuO+xLJgoiKIx2L3GWLTkbKjAbsk0iedaUuh+GdmUatsU09UVZi9IYBJYjhiYuZKsYx2LNpBqu8fxh5RaBfadQzkDGVBJE45/9X+9vlSygk3zMak9yWtS9vfV+CoODJx9wA1tv3r0Veiy/Y9bbcT7DtQhiWscP2X/cF2QZtdabW+Rb+zKZomn+6upN+zZeyVRClRsqVNURxevMs+UyJTKV481ayMU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBHpEZiCiKJZKK5PvXzPGl0kyJcU4P7nxoUjBffLcHt9dAB0RhjGORZ4v3/W6TdO0PAsLaKZ7WyFecLN3V9VWyiA=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIK5vZWfq5/76ny3vCPOJqG/mpsIiiNwZzQWhA7bM1PFT", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_fips": false, "ansible_lsb": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_local": {}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, 
"releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_apparmor": {"status": "disabled"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "54", "second": "15", "epoch": "1727204055", "epoch_int": "1727204055", "date": "2024-09-24", "time": "14:54:15", "iso8601_micro": "2024-09-24T18:54:15.503740Z", "iso8601": "2024-09-24T18:54:15Z", "iso8601_basic": "20240924T145415503740", "iso8601_basic_short": "20240924T145415", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 42862 10.31.9.148 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 42862 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", 
"which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node1", "ansible_hostname": "managed-node1", "ansible_nodename": "managed-node1", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "622812727ab94fd6acd7dd0d437b6e90", "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing 
encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # 
cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing 
socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy 
ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing 
ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing 
ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing 
ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy 
ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing gc # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy 
importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy queue # destroy multiprocessing.process # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping gc # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid 
# cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping 
_frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy unicodedata # destroy gc # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. [WARNING]: Module invocation had junk after the JSON data: # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # 
cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # 
cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # 
cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file 
# cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing 
ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing 
ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing 
ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos 
# destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing gc # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # 
destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy queue # destroy multiprocessing.process # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping gc # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] 
wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy unicodedata # destroy gc # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # 
destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks 12180 1727204055.55824: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204054.9944718-12480-42460675977174/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12180 1727204055.55832: _low_level_execute_command(): starting 12180 1727204055.55835: _low_level_execute_command(): executing: /bin/sh 
-c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204054.9944718-12480-42460675977174/ > /dev/null 2>&1 && sleep 0' 12180 1727204055.58176: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204055.58180: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204055.58221: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 12180 1727204055.58224: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204055.58229: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204055.58291: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204055.58295: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204055.58306: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204055.58385: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204055.60192: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204055.60268: stderr chunk (state=3): >>><<< 12180 1727204055.60272: stdout chunk (state=3): >>><<< 12180 1727204055.60571: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204055.60575: handler run complete 12180 1727204055.60578: variable 'ansible_facts' from source: unknown 12180 1727204055.60580: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204055.60583: variable 'ansible_facts' from source: unknown 12180 1727204055.60609: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204055.60694: attempt loop complete, returning result 12180 1727204055.60706: _execute() done 12180 1727204055.60714: dumping result to json 12180 1727204055.60734: done dumping result, returning 12180 1727204055.60748: done running TaskExecutor() for managed-node1/TASK: Gather the minimum subset of ansible_facts required by the network role test [0affcd87-79f5-ccb1-55ae-0000000000de] 12180 
1727204055.60757: sending task result for task 0affcd87-79f5-ccb1-55ae-0000000000de ok: [managed-node1] 12180 1727204055.61091: no more pending results, returning what we have 12180 1727204055.61094: results queue empty 12180 1727204055.61095: checking for any_errors_fatal 12180 1727204055.61096: done checking for any_errors_fatal 12180 1727204055.61096: checking for max_fail_percentage 12180 1727204055.61098: done checking for max_fail_percentage 12180 1727204055.61099: checking to see if all hosts have failed and the running result is not ok 12180 1727204055.61099: done checking to see if all hosts have failed 12180 1727204055.61100: getting the remaining hosts for this loop 12180 1727204055.61101: done getting the remaining hosts for this loop 12180 1727204055.61105: getting the next task for host managed-node1 12180 1727204055.61114: done getting next task for host managed-node1 12180 1727204055.61116: ^ task is: TASK: Check if system is ostree 12180 1727204055.61118: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12180 1727204055.61122: getting variables 12180 1727204055.61124: in VariableManager get_vars() 12180 1727204055.61146: Calling all_inventory to load vars for managed-node1 12180 1727204055.61148: Calling groups_inventory to load vars for managed-node1 12180 1727204055.61151: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204055.61162: Calling all_plugins_play to load vars for managed-node1 12180 1727204055.61167: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204055.61171: Calling groups_plugins_play to load vars for managed-node1 12180 1727204055.61333: done sending task result for task 0affcd87-79f5-ccb1-55ae-0000000000de 12180 1727204055.61337: WORKER PROCESS EXITING 12180 1727204055.61351: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204055.61553: done with get_vars() 12180 1727204055.61563: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Tuesday 24 September 2024 14:54:15 -0400 (0:00:00.838) 0:00:03.029 ***** 12180 1727204055.61719: entering _queue_task() for managed-node1/stat 12180 1727204055.62714: worker is 1 (out of 1 available) 12180 1727204055.62726: exiting _queue_task() for managed-node1/stat 12180 1727204055.62737: done queuing things up, now waiting for results queue to drain 12180 1727204055.62739: waiting for pending results... 
12180 1727204055.63419: running TaskExecutor() for managed-node1/TASK: Check if system is ostree 12180 1727204055.63643: in run() - task 0affcd87-79f5-ccb1-55ae-0000000000e0 12180 1727204055.63662: variable 'ansible_search_path' from source: unknown 12180 1727204055.63673: variable 'ansible_search_path' from source: unknown 12180 1727204055.63717: calling self._execute() 12180 1727204055.63906: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204055.63918: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204055.63938: variable 'omit' from source: magic vars 12180 1727204055.66082: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12180 1727204055.67073: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12180 1727204055.67123: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12180 1727204055.67388: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12180 1727204055.67431: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12180 1727204055.67737: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12180 1727204055.67801: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12180 1727204055.67838: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12180 1727204055.68012: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12180 1727204055.68475: Evaluated conditional (not __network_is_ostree is defined): True 12180 1727204055.68566: variable 'omit' from source: magic vars 12180 1727204055.69701: variable 'omit' from source: magic vars 12180 1727204055.70116: variable 'omit' from source: magic vars 12180 1727204055.70157: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12180 1727204055.70195: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12180 1727204055.70219: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12180 1727204055.70243: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204055.70257: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204055.70293: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12180 1727204055.70302: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204055.70310: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204055.70411: Set connection var ansible_pipelining to False 12180 1727204055.70686: Set connection var ansible_shell_type to sh 12180 1727204055.70699: Set connection var ansible_timeout to 10 12180 1727204055.70708: Set connection var ansible_connection to ssh 12180 1727204055.70717: Set connection var ansible_shell_executable to /bin/sh 12180 1727204055.70725: Set connection var ansible_module_compression to ZIP_DEFLATED 12180 1727204055.70761: variable 'ansible_shell_executable' from source: unknown 12180 1727204055.71276: variable 'ansible_connection' from 
source: unknown 12180 1727204055.71285: variable 'ansible_module_compression' from source: unknown 12180 1727204055.71293: variable 'ansible_shell_type' from source: unknown 12180 1727204055.71301: variable 'ansible_shell_executable' from source: unknown 12180 1727204055.71309: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204055.71318: variable 'ansible_pipelining' from source: unknown 12180 1727204055.71329: variable 'ansible_timeout' from source: unknown 12180 1727204055.71338: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204055.71496: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 12180 1727204055.71586: variable 'omit' from source: magic vars 12180 1727204055.71601: starting attempt loop 12180 1727204055.71653: running the handler 12180 1727204055.71673: _low_level_execute_command(): starting 12180 1727204055.71687: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12180 1727204055.73559: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204055.73566: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204055.73588: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration <<< 12180 1727204055.73593: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204055.73780: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204055.73846: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204055.73850: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204055.73951: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 12180 1727204055.76166: stdout chunk (state=3): >>>/root <<< 12180 1727204055.76320: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204055.76412: stderr chunk (state=3): >>><<< 12180 1727204055.76417: stdout chunk (state=3): >>><<< 12180 1727204055.76543: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 12180 1727204055.76554: _low_level_execute_command(): starting 12180 1727204055.76557: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204055.7644339-12509-195738381136858 `" && echo ansible-tmp-1727204055.7644339-12509-195738381136858="` echo /root/.ansible/tmp/ansible-tmp-1727204055.7644339-12509-195738381136858 `" ) && sleep 0' 12180 1727204055.78457: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204055.78753: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204055.78772: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204055.78789: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204055.78838: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204055.78853: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204055.78870: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204055.78888: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204055.78898: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204055.78907: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204055.78917: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204055.78932: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 
1727204055.78948: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204055.78960: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204055.78973: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204055.78986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204055.79065: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204055.79290: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204055.79308: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204055.79400: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204055.81520: stdout chunk (state=3): >>>ansible-tmp-1727204055.7644339-12509-195738381136858=/root/.ansible/tmp/ansible-tmp-1727204055.7644339-12509-195738381136858 <<< 12180 1727204055.81755: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204055.81759: stdout chunk (state=3): >>><<< 12180 1727204055.81767: stderr chunk (state=3): >>><<< 12180 1727204055.81796: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204055.7644339-12509-195738381136858=/root/.ansible/tmp/ansible-tmp-1727204055.7644339-12509-195738381136858 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204055.81848: variable 'ansible_module_compression' from source: unknown 12180 1727204055.81918: ANSIBALLZ: Using lock for stat 12180 1727204055.81921: ANSIBALLZ: Acquiring lock 12180 1727204055.81924: ANSIBALLZ: Lock acquired: 140650305862928 12180 1727204055.81929: ANSIBALLZ: Creating module 12180 1727204056.36439: ANSIBALLZ: Writing module into payload 12180 1727204056.36781: ANSIBALLZ: Writing module 12180 1727204056.36801: ANSIBALLZ: Renaming module 12180 1727204056.36806: ANSIBALLZ: Done creating module 12180 1727204056.36822: variable 'ansible_facts' from source: unknown 12180 1727204056.37114: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204055.7644339-12509-195738381136858/AnsiballZ_stat.py 12180 1727204056.38911: Sending initial data 12180 1727204056.38915: Sent initial data (153 bytes) 12180 1727204056.43401: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204056.43406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204056.43499: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204056.43503: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204056.43577: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found <<< 12180 1727204056.43693: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204056.43771: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204056.43930: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204056.43983: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204056.45823: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12180 1727204056.45877: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12180 1727204056.45929: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12180cbnqllfr/tmpshvzqx52 
/root/.ansible/tmp/ansible-tmp-1727204055.7644339-12509-195738381136858/AnsiballZ_stat.py <<< 12180 1727204056.45982: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12180 1727204056.47319: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204056.47505: stderr chunk (state=3): >>><<< 12180 1727204056.47513: stdout chunk (state=3): >>><<< 12180 1727204056.47516: done transferring module to remote 12180 1727204056.47519: _low_level_execute_command(): starting 12180 1727204056.47521: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204055.7644339-12509-195738381136858/ /root/.ansible/tmp/ansible-tmp-1727204055.7644339-12509-195738381136858/AnsiballZ_stat.py && sleep 0' 12180 1727204056.49773: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204056.49920: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204056.49941: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204056.49961: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204056.50016: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204056.50139: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204056.50154: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204056.50174: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204056.50250: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204056.50265: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204056.50278: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 12180 1727204056.50292: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204056.50308: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204056.50321: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204056.50343: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204056.50362: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204056.50447: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204056.50581: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204056.50685: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204056.50902: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204056.52714: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204056.52718: stdout chunk (state=3): >>><<< 12180 1727204056.52721: stderr chunk (state=3): >>><<< 12180 1727204056.52770: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204056.52773: _low_level_execute_command(): starting 12180 1727204056.52776: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204055.7644339-12509-195738381136858/AnsiballZ_stat.py && sleep 0' 12180 1727204056.54450: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204056.54609: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204056.54629: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204056.54650: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204056.54707: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204056.54829: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204056.54846: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204056.54868: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204056.54882: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204056.54894: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204056.54907: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204056.54931: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 12180 1727204056.54948: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204056.54961: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204056.54976: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204056.54990: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204056.55107: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204056.55273: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204056.55288: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204056.55581: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204056.57511: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin import '_thread' # <<< 12180 1727204056.57515: stdout chunk (state=3): >>>import '_warnings' # import '_weakref' # <<< 12180 1727204056.57579: stdout chunk (state=3): >>>import '_io' # <<< 12180 1727204056.57585: stdout chunk (state=3): >>>import 'marshal' # <<< 12180 1727204056.57604: stdout chunk (state=3): >>>import 'posix' # <<< 12180 1727204056.57643: stdout chunk (state=3): >>>import '_frozen_importlib_external' # <<< 12180 1727204056.57649: stdout chunk (state=3): >>># installing zipimport hook <<< 12180 1727204056.57689: stdout chunk (state=3): >>>import 'time' # <<< 12180 1727204056.57692: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 12180 1727204056.57759: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py <<< 12180 1727204056.57766: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' <<< 12180 1727204056.57770: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py <<< 12180 1727204056.57801: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # <<< 12180 1727204056.57813: stdout chunk (state=3): >>>import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c02243dc0> <<< 12180 1727204056.57849: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py <<< 12180 1727204056.57871: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01fd83a0> <<< 12180 1727204056.57903: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c02243b20> <<< 12180 1727204056.57907: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' <<< 12180 1727204056.57925: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c02243ac0> <<< 12180 1727204056.57959: stdout chunk (state=3): >>>import '_signal' # <<< 12180 1727204056.57963: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py <<< 12180 1727204056.57967: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' <<< 12180 1727204056.57992: stdout chunk (state=3): >>>import 'encodings.latin_1' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f2c01fd8490> <<< 12180 1727204056.57997: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' <<< 12180 1727204056.58030: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' <<< 12180 1727204056.58043: stdout chunk (state=3): >>>import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01fd8940> <<< 12180 1727204056.58072: stdout chunk (state=3): >>>import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01fd8670> <<< 12180 1727204056.58096: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py <<< 12180 1727204056.58113: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' <<< 12180 1727204056.58126: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py <<< 12180 1727204056.58154: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' <<< 12180 1727204056.58171: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py <<< 12180 1727204056.58195: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' <<< 12180 1727204056.58218: stdout chunk (state=3): >>>import '_stat' # <<< 12180 1727204056.58221: stdout chunk (state=3): >>>import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01f8f190> <<< 12180 1727204056.58238: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches 
/usr/lib64/python3.9/_collections_abc.py <<< 12180 1727204056.58262: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' <<< 12180 1727204056.58330: stdout chunk (state=3): >>>import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01f8f220> <<< 12180 1727204056.58366: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py <<< 12180 1727204056.58390: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' <<< 12180 1727204056.58393: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01fb2850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01f8f940> <<< 12180 1727204056.58423: stdout chunk (state=3): >>>import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01ff0880> <<< 12180 1727204056.58452: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' <<< 12180 1727204056.58455: stdout chunk (state=3): >>>import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01f88d90> <<< 12180 1727204056.58524: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py <<< 12180 1727204056.58530: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # <<< 12180 1727204056.58533: stdout chunk (state=3): >>>import '_bootlocale' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f2c01fb2d90> <<< 12180 1727204056.58579: stdout chunk (state=3): >>>import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01fd8970> <<< 12180 1727204056.58604: stdout chunk (state=3): >>>Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 12180 1727204056.58803: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py <<< 12180 1727204056.58846: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' <<< 12180 1727204056.58849: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py <<< 12180 1727204056.58852: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' <<< 12180 1727204056.58861: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py <<< 12180 1727204056.58886: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' <<< 12180 1727204056.58903: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py <<< 12180 1727204056.58918: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' <<< 12180 1727204056.58937: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01f2df10> <<< 12180 1727204056.58979: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01f340a0> <<< 12180 1727204056.58996: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py <<< 12180 
1727204056.59010: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' <<< 12180 1727204056.59019: stdout chunk (state=3): >>>import '_sre' # <<< 12180 1727204056.59052: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' <<< 12180 1727204056.59079: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' <<< 12180 1727204056.59159: stdout chunk (state=3): >>>import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01f275b0> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01f2e6a0> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01f2d3d0> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py <<< 12180 1727204056.59224: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' <<< 12180 1727204056.59238: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py <<< 12180 1727204056.59278: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' <<< 12180 1727204056.59292: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' <<< 12180 1727204056.59327: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' <<< 12180 1727204056.59345: 
stdout chunk (state=3): >>># extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c01eb1eb0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01eb19a0> import 'itertools' # <<< 12180 1727204056.59365: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py <<< 12180 1727204056.59390: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01eb1fa0> <<< 12180 1727204056.59404: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' <<< 12180 1727204056.59480: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01eb1df0> # /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01ec1160> import '_collections' # <<< 12180 1727204056.59615: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01f09e20> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01f01700> # /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py <<< 12180 1727204056.59635: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01f15760> import 
're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01f35eb0> <<< 12180 1727204056.59648: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' <<< 12180 1727204056.59686: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c01ec1d60> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01f09340> <<< 12180 1727204056.59725: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' <<< 12180 1727204056.59744: stdout chunk (state=3): >>>import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c01f15370> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01f3ba60> # /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py <<< 12180 1727204056.59758: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' <<< 12180 1727204056.59835: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f2c01ec1f40> <<< 12180 1727204056.59857: stdout chunk (state=3): >>>import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01ec1e80> <<< 12180 1727204056.59874: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01ec1df0> <<< 12180 1727204056.59890: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' <<< 12180 1727204056.59944: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' <<< 12180 1727204056.59955: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py <<< 12180 1727204056.60051: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01e95460> # /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py <<< 12180 1727204056.60063: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' <<< 12180 1727204056.60156: stdout chunk (state=3): >>>import 'contextlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f2c01e95550> <<< 12180 1727204056.60213: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01e730d0> <<< 12180 1727204056.60270: stdout chunk (state=3): >>>import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01ec4b20> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01ec44c0> <<< 12180 1727204056.60286: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py <<< 12180 1727204056.60296: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' <<< 12180 1727204056.60377: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' <<< 12180 1727204056.60388: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01daf2b0> <<< 12180 1727204056.60486: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01e80d60> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01ec4fa0> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01f3b0d0> <<< 12180 1727204056.60497: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py <<< 12180 1727204056.60604: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches 
/usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01dbfbe0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c01dbff10> <<< 12180 1727204056.60619: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' <<< 12180 1727204056.60681: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01dd2820> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py <<< 12180 1727204056.60710: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' <<< 12180 1727204056.60743: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01dd2d60> <<< 12180 1727204056.60777: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c01d60490> <<< 12180 1727204056.60796: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01dbff40> <<< 12180 
1727204056.60817: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' <<< 12180 1727204056.60880: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c01d70370> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01dd26a0> import 'pwd' # <<< 12180 1727204056.60925: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c01d70430> <<< 12180 1727204056.61039: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01ec1ac0> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' <<< 12180 1727204056.61056: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c01d8c790> <<< 12180 1727204056.61075: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' <<< 12180 1727204056.61147: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c01d8ca60> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01d8c850> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c01d8c940> <<< 12180 1727204056.61169: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py <<< 12180 1727204056.61181: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' <<< 12180 1727204056.61368: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' <<< 12180 1727204056.61389: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c01d8cd90> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' <<< 12180 1727204056.61405: stdout chunk (state=3): >>># extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c01d962e0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01d8c9d0> <<< 12180 1727204056.61419: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01d80b20> <<< 12180 1727204056.61471: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01ec16a0> <<< 12180 1727204056.61483: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py <<< 12180 1727204056.61524: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' <<< 12180 1727204056.61558: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01d8cb80> <<< 12180 1727204056.61660: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/cp437.pyc' <<< 12180 1727204056.61676: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f2c01cb5760> <<< 12180 1727204056.61845: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_elbkmts9/ansible_stat_payload.zip' <<< 12180 1727204056.61856: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204056.61936: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204056.61972: stdout chunk (state=3): >>>import ansible # loaded from Zip /tmp/ansible_stat_payload_elbkmts9/ansible_stat_payload.zip/ansible/__init__.py <<< 12180 1727204056.61988: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 12180 1727204056.62009: stdout chunk (state=3): >>>import ansible.module_utils # loaded from Zip /tmp/ansible_stat_payload_elbkmts9/ansible_stat_payload.zip/ansible/module_utils/__init__.py <<< 12180 1727204056.62029: stdout chunk (state=3): 
>>># zipimport: zlib available <<< 12180 1727204056.63240: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204056.64193: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01bdc8b0> <<< 12180 1727204056.64221: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' <<< 12180 1727204056.64237: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' <<< 12180 1727204056.64256: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' <<< 12180 1727204056.64292: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' <<< 12180 1727204056.64296: stdout chunk (state=3): >>># extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c01bdc160> <<< 12180 1727204056.64331: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01bdc280> <<< 12180 1727204056.64361: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01bdc5e0> <<< 12180 1727204056.64383: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' <<< 12180 1727204056.64440: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01bdc4f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01bdce20> <<< 12180 1727204056.64443: stdout chunk (state=3): >>>import 'atexit' # <<< 12180 1727204056.64470: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c01bdc580> <<< 12180 1727204056.64481: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py <<< 12180 1727204056.64511: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' <<< 12180 1727204056.64550: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01bdc100> <<< 12180 1727204056.64568: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py <<< 12180 1727204056.64611: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' <<< 12180 1727204056.64616: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py <<< 12180 1727204056.64635: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' <<< 12180 1727204056.64652: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches 
/usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' <<< 12180 1727204056.64723: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c015edfd0> <<< 12180 1727204056.64759: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c01b51c40> <<< 12180 1727204056.64790: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c01b51f40> <<< 12180 1727204056.64809: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py <<< 12180 1727204056.64839: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' <<< 12180 1727204056.64888: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01b512e0> <<< 12180 1727204056.64891: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01c44d90> <<< 12180 1727204056.65074: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01c443a0> <<< 12180 1727204056.65080: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py <<< 12180 1727204056.65083: stdout chunk (state=3): >>># code 
object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' <<< 12180 1727204056.65130: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01c44f40> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' <<< 12180 1727204056.65151: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' <<< 12180 1727204056.65182: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py <<< 12180 1727204056.65185: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' <<< 12180 1727204056.65222: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' <<< 12180 1727204056.65225: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01cb5a90> <<< 12180 1727204056.65300: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01bafdc0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01baf490> <<< 12180 1727204056.65329: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01be6580> <<< 12180 1727204056.65341: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c01baf5b0> <<< 12180 1727204056.65379: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01baf5e0> <<< 12180 1727204056.65396: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py <<< 12180 1727204056.65407: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' <<< 12180 1727204056.65436: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py <<< 12180 1727204056.65448: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' <<< 12180 1727204056.65519: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c015def70> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01c242e0> <<< 12180 1727204056.65559: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py <<< 12180 1727204056.65562: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' <<< 12180 1727204056.65617: stdout chunk (state=3): >>># extension module '_uuid' loaded from 
'/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c015db7f0> <<< 12180 1727204056.65621: stdout chunk (state=3): >>>import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01c24460> <<< 12180 1727204056.65646: stdout chunk (state=3): >>># /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py <<< 12180 1727204056.65705: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' <<< 12180 1727204056.65708: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py <<< 12180 1727204056.65710: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' <<< 12180 1727204056.65712: stdout chunk (state=3): >>>import '_string' # <<< 12180 1727204056.65770: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01c3cf40> <<< 12180 1727204056.65897: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c015db790> <<< 12180 1727204056.65986: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c015db5e0> <<< 12180 1727204056.66025: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' <<< 
12180 1727204056.66030: stdout chunk (state=3): >>># extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c015da550> <<< 12180 1727204056.66074: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' <<< 12180 1727204056.66080: stdout chunk (state=3): >>># extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c015da490> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01c1d9a0> <<< 12180 1727204056.66099: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' <<< 12180 1727204056.66118: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py <<< 12180 1727204056.66131: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' <<< 12180 1727204056.66178: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c01ba56a0> <<< 12180 1727204056.66353: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 
'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c01ba4bb0> <<< 12180 1727204056.66366: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01bb50d0> <<< 12180 1727204056.66409: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' <<< 12180 1727204056.66412: stdout chunk (state=3): >>># extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c01ba5100> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01be8c40> <<< 12180 1727204056.66448: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_stat_payload_elbkmts9/ansible_stat_payload.zip/ansible/module_utils/compat/__init__.py <<< 12180 1727204056.66451: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204056.66525: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204056.66607: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204056.66611: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_stat_payload_elbkmts9/ansible_stat_payload.zip/ansible/module_utils/common/__init__.py <<< 12180 1727204056.66654: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204056.66657: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204056.66662: stdout chunk (state=3): >>>import ansible.module_utils.common.text # loaded from Zip 
/tmp/ansible_stat_payload_elbkmts9/ansible_stat_payload.zip/ansible/module_utils/common/text/__init__.py <<< 12180 1727204056.66675: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204056.66766: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204056.66852: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204056.67304: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204056.67780: stdout chunk (state=3): >>>import ansible.module_utils.six # loaded from Zip /tmp/ansible_stat_payload_elbkmts9/ansible_stat_payload.zip/ansible/module_utils/six/__init__.py <<< 12180 1727204056.67784: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # <<< 12180 1727204056.67790: stdout chunk (state=3): >>>import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_stat_payload_elbkmts9/ansible_stat_payload.zip/ansible/module_utils/common/text/converters.py <<< 12180 1727204056.67805: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' <<< 12180 1727204056.67866: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' <<< 12180 1727204056.67870: stdout chunk (state=3): >>># extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c015a7940> <<< 12180 1727204056.67933: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py <<< 12180 1727204056.67939: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01ba2d30> <<< 12180 1727204056.67951: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01b997c0> <<< 12180 1727204056.68013: stdout chunk (state=3): >>>import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_stat_payload_elbkmts9/ansible_stat_payload.zip/ansible/module_utils/compat/selinux.py <<< 12180 1727204056.68016: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204056.68044: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204056.68047: stdout chunk (state=3): >>>import ansible.module_utils._text # loaded from Zip /tmp/ansible_stat_payload_elbkmts9/ansible_stat_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available <<< 12180 1727204056.68170: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204056.68300: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py <<< 12180 1727204056.68303: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' <<< 12180 1727204056.68318: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01ba44c0> <<< 12180 1727204056.68340: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204056.68717: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204056.69083: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204056.69137: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204056.69205: stdout chunk (state=3): >>>import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_stat_payload_elbkmts9/ansible_stat_payload.zip/ansible/module_utils/common/collections.py <<< 12180 1727204056.69211: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204056.69242: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204056.69274: stdout chunk (state=3): >>>import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_stat_payload_elbkmts9/ansible_stat_payload.zip/ansible/module_utils/common/warnings.py <<< 12180 1727204056.69286: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204056.69341: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204056.69430: stdout chunk (state=3): >>>import ansible.module_utils.errors # loaded from Zip /tmp/ansible_stat_payload_elbkmts9/ansible_stat_payload.zip/ansible/module_utils/errors.py <<< 12180 1727204056.69436: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204056.69441: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204056.69461: stdout chunk (state=3): >>>import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_stat_payload_elbkmts9/ansible_stat_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available <<< 12180 1727204056.69486: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204056.69531: stdout chunk (state=3): >>>import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_stat_payload_elbkmts9/ansible_stat_payload.zip/ansible/module_utils/parsing/convert_bool.py <<< 12180 1727204056.69534: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204056.69714: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204056.69908: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py <<< 12180 1727204056.69939: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' <<< 12180 1727204056.69942: stdout chunk (state=3): >>>import '_ast' # <<< 12180 1727204056.70019: stdout chunk (state=3): 
>>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c0115d940> <<< 12180 1727204056.70022: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204056.70080: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204056.70160: stdout chunk (state=3): >>>import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_stat_payload_elbkmts9/ansible_stat_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_stat_payload_elbkmts9/ansible_stat_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_stat_payload_elbkmts9/ansible_stat_payload.zip/ansible/module_utils/common/parameters.py <<< 12180 1727204056.70168: stdout chunk (state=3): >>>import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_stat_payload_elbkmts9/ansible_stat_payload.zip/ansible/module_utils/common/arg_spec.py <<< 12180 1727204056.70184: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204056.70217: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204056.70267: stdout chunk (state=3): >>>import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_stat_payload_elbkmts9/ansible_stat_payload.zip/ansible/module_utils/common/locale.py <<< 12180 1727204056.70270: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204056.70294: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204056.70336: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204056.70421: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204056.70563: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py # code object from 
'/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' <<< 12180 1727204056.70586: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c01c2fb50> <<< 12180 1727204056.70611: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01165fa0> <<< 12180 1727204056.70651: stdout chunk (state=3): >>>import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_stat_payload_elbkmts9/ansible_stat_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_stat_payload_elbkmts9/ansible_stat_payload.zip/ansible/module_utils/common/process.py <<< 12180 1727204056.70674: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204056.70782: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204056.70830: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204056.70846: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204056.70907: stdout chunk (state=3): >>># /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py <<< 12180 1727204056.70913: stdout chunk (state=3): >>># code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' <<< 12180 1727204056.70925: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py <<< 12180 1727204056.70952: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' <<< 12180 1727204056.70985: stdout 
chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py <<< 12180 1727204056.70988: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' <<< 12180 1727204056.71080: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c011ad6d0> <<< 12180 1727204056.71121: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c0159dc10> <<< 12180 1727204056.71182: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c0159c5b0> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_stat_payload_elbkmts9/ansible_stat_payload.zip/ansible/module_utils/distro/__init__.py <<< 12180 1727204056.71185: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204056.71208: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204056.71242: stdout chunk (state=3): >>>import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_stat_payload_elbkmts9/ansible_stat_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_stat_payload_elbkmts9/ansible_stat_payload.zip/ansible/module_utils/common/sys_info.py <<< 12180 1727204056.71312: stdout chunk (state=3): >>>import ansible.module_utils.basic # loaded from Zip /tmp/ansible_stat_payload_elbkmts9/ansible_stat_payload.zip/ansible/module_utils/basic.py <<< 12180 1727204056.71318: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204056.71343: stdout chunk (state=3): >>># zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_stat_payload_elbkmts9/ansible_stat_payload.zip/ansible/modules/__init__.py <<< 12180 1727204056.71357: stdout chunk (state=3): >>># zipimport: zlib available <<< 
12180 1727204056.71465: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204056.71626: stdout chunk (state=3): >>># zipimport: zlib available <<< 12180 1727204056.71768: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ <<< 12180 1727204056.71994: stdout chunk (state=3): >>># clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1<<< 12180 1727204056.71997: stdout chunk (state=3): >>> # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout <<< 12180 1727204056.72004: stdout chunk (state=3): >>># restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io <<< 12180 1727204056.72030: stdout chunk (state=3): >>># cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal <<< 12180 1727204056.72048: stdout chunk (state=3): >>># cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing 
os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re <<< 12180 1727204056.72090: stdout chunk (state=3): >>># cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] 
removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ <<< 12180 1727204056.72130: stdout chunk (state=3): >>># cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # 
cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections <<< 12180 1727204056.72135: stdout chunk (state=3): >>># destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] 
removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 12180 1727204056.72318: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 12180 1727204056.72325: stdout chunk (state=3): >>># destroy importlib.util # destroy importlib.abc # destroy importlib.machinery <<< 12180 1727204056.72367: stdout chunk (state=3): >>># destroy zipimport # destroy _compression <<< 12180 1727204056.72374: stdout chunk (state=3): >>># destroy binascii # destroy importlib # destroy struct # destroy bz2 # destroy lzma <<< 12180 1727204056.72401: stdout chunk (state=3): >>># destroy __main__ # destroy locale # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid <<< 12180 1727204056.72407: stdout chunk (state=3): >>># destroy array # destroy datetime <<< 12180 1727204056.72428: stdout chunk (state=3): >>># destroy selinux # destroy distro # destroy json # destroy shlex # destroy logging # destroy argparse <<< 12180 1727204056.72484: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux <<< 12180 1727204056.72521: stdout chunk (state=3): >>># cleanup[3] 
wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize <<< 12180 1727204056.72529: stdout chunk (state=3): >>># cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors <<< 12180 1727204056.72537: stdout chunk (state=3): >>># cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading <<< 12180 1727204056.72584: stdout chunk (state=3): >>># cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile <<< 12180 1727204056.72589: stdout chunk (state=3): >>># destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq <<< 12180 1727204056.72592: stdout chunk (state=3): >>># destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # 
cleanup[3] wiping itertools <<< 12180 1727204056.72658: stdout chunk (state=3): >>># cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat <<< 12180 1727204056.72661: stdout chunk (state=3): >>># destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal <<< 12180 1727204056.72669: stdout chunk (state=3): >>># cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix <<< 12180 1727204056.72671: stdout chunk (state=3): >>># cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins <<< 12180 1727204056.72673: stdout chunk (state=3): >>># destroy systemd._daemon # destroy _socket <<< 12180 1727204056.72674: stdout chunk (state=3): >>># destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal <<< 12180 1727204056.72823: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy _sre # destroy sre_parse <<< 12180 1727204056.72827: stdout chunk (state=3): >>># destroy tokenize # destroy _heapq <<< 12180 1727204056.72868: stdout chunk (state=3): >>># destroy posixpath <<< 12180 1727204056.72871: stdout chunk (state=3): >>># destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # 
destroy grp # destroy _posixsubprocess # destroy selectors <<< 12180 1727204056.72873: stdout chunk (state=3): >>># destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser <<< 12180 1727204056.72876: stdout chunk (state=3): >>># destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal <<< 12180 1727204056.72904: stdout chunk (state=3): >>># destroy _frozen_importlib # clear sys.audit hooks <<< 12180 1727204056.73255: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. <<< 12180 1727204056.73258: stdout chunk (state=3): >>><<< 12180 1727204056.73260: stderr chunk (state=3): >>><<< 12180 1727204056.73347: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c02243dc0> # /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches 
/usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01fd83a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c02243b20> # /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c02243ac0> import '_signal' # # /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01fd8490> # /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01fd8940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01fd8670> # /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py # code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' import '_stat' # import 'stat' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f2c01f8f190> # /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py # code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01f8f220> # /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01fb2850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01f8f940> import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01ff0880> # /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01f88d90> # /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01fb2d90> import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01fd8970> Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01f2df10> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01f340a0> # /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # # /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01f275b0> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01f2e6a0> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01f2d3d0> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py # code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches 
/usr/lib64/python3.9/collections/__init__.py # code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c01eb1eb0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01eb19a0> import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01eb1fa0> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01eb1df0> # /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01ec1160> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01f09e20> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01f01700> # /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f2c01f15760> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01f35eb0> # /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c01ec1d60> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01f09340> # extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c01f15370> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01f3ba60> # /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01ec1f40> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01ec1e80> # /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches 
/usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01ec1df0> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py # code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01e95460> # /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01e95550> import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01e730d0> import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01ec4b20> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01ec44c0> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from 
'/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01daf2b0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01e80d60> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01ec4fa0> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01f3b0d0> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01dbfbe0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c01dbff10> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01dd2820> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py # code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' 
import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01dd2d60> # extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c01d60490> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01dbff40> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c01d70370> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01dd26a0> import 'pwd' # # extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c01d70430> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01ec1ac0> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' # extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from 
'/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c01d8c790> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c01d8ca60> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01d8c850> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c01d8c940> # /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c01d8cd90> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c01d962e0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01d8c9d0> import 
'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01d80b20> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01ec16a0> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01d8cb80> # code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f2c01cb5760> # zipimport: found 30 names in '/tmp/ansible_stat_payload_elbkmts9/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available import ansible # loaded from Zip /tmp/ansible_stat_payload_elbkmts9/ansible_stat_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_stat_payload_elbkmts9/ansible_stat_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01bdc8b0> # /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' # extension module '_json' loaded 
from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c01bdc160> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01bdc280> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01bdc5e0> # /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01bdc4f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01bdce20> import 'atexit' # # extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c01bdc580> # /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py # code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01bdc100> # /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py # code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' import 'signal' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f2c015edfd0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c01b51c40> # extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c01b51f40> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01b512e0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01c44d90> import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01c443a0> # /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01c44f40> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from 
'/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01cb5a90> import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01bafdc0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01baf490> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01be6580> # extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c01baf5b0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01baf5e0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py # code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c015def70> 
import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01c242e0> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c015db7f0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01c24460> # /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py # code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01c3cf40> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c015db790> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c015db5e0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c015da550> # extension module 
'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c015da490> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01c1d9a0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c01ba56a0> # extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c01ba4bb0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01bb50d0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c01ba5100> import 'systemd.daemon' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f2c01be8c40> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_stat_payload_elbkmts9/ansible_stat_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_stat_payload_elbkmts9/ansible_stat_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_stat_payload_elbkmts9/ansible_stat_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.six # loaded from Zip /tmp/ansible_stat_payload_elbkmts9/ansible_stat_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_stat_payload_elbkmts9/ansible_stat_payload.zip/ansible/module_utils/common/text/converters.py # /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c015a7940> # /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from 
'/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01ba2d30> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01b997c0> import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_stat_payload_elbkmts9/ansible_stat_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_stat_payload_elbkmts9/ansible_stat_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01ba44c0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_stat_payload_elbkmts9/ansible_stat_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_stat_payload_elbkmts9/ansible_stat_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.errors # loaded from Zip /tmp/ansible_stat_payload_elbkmts9/ansible_stat_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_stat_payload_elbkmts9/ansible_stat_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing.convert_bool # loaded from Zip 
/tmp/ansible_stat_payload_elbkmts9/ansible_stat_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py # code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c0115d940> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_stat_payload_elbkmts9/ansible_stat_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_stat_payload_elbkmts9/ansible_stat_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_stat_payload_elbkmts9/ansible_stat_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_stat_payload_elbkmts9/ansible_stat_payload.zip/ansible/module_utils/common/arg_spec.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_stat_payload_elbkmts9/ansible_stat_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' 
import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2c01c2fb50> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c01165fa0> import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_stat_payload_elbkmts9/ansible_stat_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_stat_payload_elbkmts9/ansible_stat_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py # code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c011ad6d0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c0159dc10> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2c0159c5b0> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_stat_payload_elbkmts9/ansible_stat_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_stat_payload_elbkmts9/ansible_stat_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip 
/tmp/ansible_stat_payload_elbkmts9/ansible_stat_payload.zip/ansible/module_utils/common/sys_info.py import ansible.module_utils.basic # loaded from Zip /tmp/ansible_stat_payload_elbkmts9/ansible_stat_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_stat_payload_elbkmts9/ansible_stat_payload.zip/ansible/modules/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # 
cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] 
removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy 
ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # 
destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy struct # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy array # destroy datetime # destroy selinux # destroy distro # destroy json # destroy shlex # destroy logging # destroy argparse # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # 
cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] 
wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. [WARNING]: Module invocation had junk after the JSON data: # destroy __main__ # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] 
removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # 
cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy 
ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] 
removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy struct # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy array # destroy datetime # destroy selinux # destroy distro # destroy json # destroy shlex # destroy logging # destroy argparse # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # 
destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # 
destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks 12180 1727204056.74069: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204055.7644339-12509-195738381136858/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12180 1727204056.74072: _low_level_execute_command(): starting 12180 1727204056.74075: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204055.7644339-12509-195738381136858/ > /dev/null 2>&1 && sleep 0' 12180 1727204056.76834: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204056.76839: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204056.76887: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204056.77086: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204056.77139: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204056.77143: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204056.77216: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204056.79003: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204056.79088: stderr chunk (state=3): >>><<< 12180 1727204056.79092: stdout chunk (state=3): >>><<< 12180 1727204056.79176: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204056.79180: handler run complete 12180 1727204056.79183: attempt loop complete, returning result 12180 1727204056.79185: _execute() done 12180 1727204056.79188: dumping result to json 12180 1727204056.79191: done dumping result, returning 12180 1727204056.79193: done running TaskExecutor() for managed-node1/TASK: Check if system is ostree [0affcd87-79f5-ccb1-55ae-0000000000e0] 12180 1727204056.79195: sending task result for task 0affcd87-79f5-ccb1-55ae-0000000000e0 12180 1727204056.79439: done sending task result for task 0affcd87-79f5-ccb1-55ae-0000000000e0 12180 1727204056.79443: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "stat": { "exists": false } } 12180 1727204056.79546: no more pending results, returning what we have 12180 1727204056.79550: results queue empty 12180 1727204056.79551: checking for any_errors_fatal 12180 1727204056.79560: done checking for any_errors_fatal 12180 1727204056.79560: checking for max_fail_percentage 12180 1727204056.79562: done checking for max_fail_percentage 12180 1727204056.79565: checking to see if all hosts have failed and the running result is not ok 12180 1727204056.79566: done checking to see if all hosts have failed 12180 
1727204056.79567: getting the remaining hosts for this loop 12180 1727204056.79568: done getting the remaining hosts for this loop 12180 1727204056.79573: getting the next task for host managed-node1 12180 1727204056.79580: done getting next task for host managed-node1 12180 1727204056.79582: ^ task is: TASK: Set flag to indicate system is ostree 12180 1727204056.79585: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12180 1727204056.79591: getting variables 12180 1727204056.79593: in VariableManager get_vars() 12180 1727204056.79627: Calling all_inventory to load vars for managed-node1 12180 1727204056.79630: Calling groups_inventory to load vars for managed-node1 12180 1727204056.79634: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204056.79645: Calling all_plugins_play to load vars for managed-node1 12180 1727204056.79647: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204056.79650: Calling groups_plugins_play to load vars for managed-node1 12180 1727204056.79930: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204056.80240: done with get_vars() 12180 1727204056.80252: done getting variables 12180 1727204056.80410: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)
TASK [Set flag to indicate system is ostree] ***********************************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22
Tuesday 24 September 2024 14:54:16 -0400 (0:00:01.188) 0:00:04.217 *****
12180 1727204056.80558: entering _queue_task() for managed-node1/set_fact
12180 1727204056.80560: Creating lock for set_fact
12180 1727204056.82682: worker is 1 (out of 1 available)
12180 1727204056.82695: exiting _queue_task() for managed-node1/set_fact
12180 1727204056.82706: done queuing things up, now waiting for results queue to drain
12180 1727204056.82708: waiting for pending results...
12180 1727204056.82980: running TaskExecutor() for managed-node1/TASK: Set flag to indicate system is ostree
12180 1727204056.83090: in run() - task 0affcd87-79f5-ccb1-55ae-0000000000e1
12180 1727204056.83104: variable 'ansible_search_path' from source: unknown
12180 1727204056.83107: variable 'ansible_search_path' from source: unknown
12180 1727204056.83158: calling self._execute()
12180 1727204056.83238: variable 'ansible_host' from source: host vars for 'managed-node1'
12180 1727204056.83250: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12180 1727204056.83261: variable 'omit' from source: magic vars
12180 1727204056.83768: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
12180 1727204056.85080: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
12180 1727204056.85147: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
12180 1727204056.85224: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
12180 1727204056.85292: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
12180 1727204056.85614: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
12180 1727204056.85855: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
12180 1727204056.85884: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
12180 1727204056.85909: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
12180 1727204056.86910: Evaluated conditional (not __network_is_ostree is defined): True
12180 1727204056.86926: variable 'omit' from source: magic vars
12180 1727204056.86974: variable 'omit' from source: magic vars
12180 1727204056.87393: variable '__ostree_booted_stat' from source: set_fact
12180 1727204056.87510: variable 'omit' from source: magic vars
12180 1727204056.87741: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
12180 1727204056.87883: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
12180 1727204056.89156: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
12180 1727204056.89189: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
12180 1727204056.89255: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
12180 1727204056.89290: variable 'inventory_hostname' from source: host vars for 'managed-node1'
12180 1727204056.89466: variable 'ansible_host' from source: host vars for 'managed-node1'
12180 1727204056.89480: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12180 1727204056.89716: Set connection var ansible_pipelining to False
12180 1727204056.89805: Set connection var ansible_shell_type to sh
12180 1727204056.89909: Set connection var ansible_timeout to 10
12180 1727204056.89924: Set connection var ansible_connection to ssh
12180 1727204056.90016: Set connection var ansible_shell_executable to /bin/sh
12180 1727204056.90033: Set connection var ansible_module_compression to ZIP_DEFLATED
12180 1727204056.90069: variable 'ansible_shell_executable' from source: unknown
12180 1727204056.90114: variable 'ansible_connection' from source: unknown
12180 1727204056.90221: variable 'ansible_module_compression' from source: unknown
12180 1727204056.90239: variable 'ansible_shell_type' from source: unknown
12180 1727204056.90247: variable 'ansible_shell_executable' from source: unknown
12180 1727204056.90258: variable 'ansible_host' from source: host vars for 'managed-node1'
12180 1727204056.90331: variable 'ansible_pipelining' from source: unknown
12180 1727204056.90346: variable 'ansible_timeout' from source: unknown
12180 1727204056.90357: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12180 1727204056.90700: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
12180 1727204056.90715: variable 'omit' from source: magic vars
12180 1727204056.90722: starting attempt loop
12180 1727204056.90727: running the handler
12180 1727204056.90776: handler run complete
12180 1727204056.90788: attempt loop complete, returning result
12180 1727204056.90883: _execute() done
12180 1727204056.90981: dumping result to json
12180 1727204056.90997: done dumping result, returning
12180 1727204056.91009: done running TaskExecutor() for managed-node1/TASK: Set flag to indicate system is ostree [0affcd87-79f5-ccb1-55ae-0000000000e1]
12180 1727204056.91019: sending task result for task 0affcd87-79f5-ccb1-55ae-0000000000e1
ok: [managed-node1] => {
    "ansible_facts": {
        "__network_is_ostree": false
    },
    "changed": false
}
12180 1727204056.91177: no more pending results, returning what we have
12180 1727204056.91180: results queue empty
12180 1727204056.91181: checking for any_errors_fatal
12180 1727204056.91188: done checking for any_errors_fatal
12180 1727204056.91189: checking for max_fail_percentage
12180 1727204056.91191: done checking for max_fail_percentage
12180 1727204056.91192: checking to see if all hosts have failed and the running result is not ok
12180 1727204056.91193: done checking to see if all hosts have failed
12180 1727204056.91193: getting the remaining hosts for this loop
12180 1727204056.91195: done getting the remaining hosts for this loop
12180 1727204056.91199: getting the next task for host managed-node1
12180 1727204056.91212: done getting next task for host managed-node1
12180 1727204056.91215: ^ task is: TASK: Fix CentOS6 Base repo
12180 1727204056.91218: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12180 1727204056.91221: getting variables
12180 1727204056.91226: in VariableManager get_vars()
12180 1727204056.91257: Calling all_inventory to load vars for managed-node1
12180 1727204056.91259: Calling groups_inventory to load vars for managed-node1
12180 1727204056.91263: Calling all_plugins_inventory to load vars for managed-node1
12180 1727204056.91276: Calling all_plugins_play to load vars for managed-node1
12180 1727204056.91278: Calling groups_plugins_inventory to load vars for managed-node1
12180 1727204056.91281: Calling groups_plugins_play to load vars for managed-node1
12180 1727204056.91462: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12180 1727204056.91660: done with get_vars()
12180 1727204056.91674: done getting variables
12180 1727204056.92261: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)
12180 1727204056.92291: done sending task result for task 0affcd87-79f5-ccb1-55ae-0000000000e1
12180 1727204056.92294: WORKER PROCESS EXITING
TASK [Fix CentOS6 Base repo] ***************************************************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26
Tuesday 24 September 2024 14:54:16 -0400 (0:00:00.117) 0:00:04.335 *****
12180 1727204056.92311: entering _queue_task() for managed-node1/copy
12180 1727204056.94116: worker is 1 (out of 1 available)
12180 1727204056.94132: exiting _queue_task() for managed-node1/copy
12180 1727204056.94145: done queuing things up, now waiting for results queue to drain
12180 1727204056.94147: waiting for pending results...
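The "Fix CentOS6 Base repo" banner above points at el_repo_setup.yml:26, and the debug output shows a `copy` action guarded by distribution conditionals. A hedged sketch of what such a guarded task could look like follows; only the task name, the `copy` action type, and the two `when` conditions are visible in this log, so the copy payload below is a placeholder, not the real file contents:

```yaml
# Sketch only -- the actual task body is not reproduced in the log.
- name: Fix CentOS6 Base repo
  copy:
    dest: /etc/yum.repos.d/CentOS-Base.repo  # hypothetical destination
    content: |
      # vault/archive mirror configuration would go here
  when:
    - ansible_distribution == 'CentOS'
    - ansible_distribution_major_version == '6'
```

On this host `ansible_distribution_major_version` is 9, so the second condition evaluates False and the task is skipped.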
12180 1727204056.95662: running TaskExecutor() for managed-node1/TASK: Fix CentOS6 Base repo
12180 1727204056.96007: in run() - task 0affcd87-79f5-ccb1-55ae-0000000000e3
12180 1727204056.96084: variable 'ansible_search_path' from source: unknown
12180 1727204056.96091: variable 'ansible_search_path' from source: unknown
12180 1727204056.96129: calling self._execute()
12180 1727204056.96445: variable 'ansible_host' from source: host vars for 'managed-node1'
12180 1727204056.96507: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12180 1727204056.96524: variable 'omit' from source: magic vars
12180 1727204056.98143: variable 'ansible_distribution' from source: facts
12180 1727204056.98258: Evaluated conditional (ansible_distribution == 'CentOS'): True
12180 1727204056.98547: variable 'ansible_distribution_major_version' from source: facts
12180 1727204056.98730: Evaluated conditional (ansible_distribution_major_version == '6'): False
12180 1727204056.98739: when evaluation is False, skipping this task
12180 1727204056.98778: _execute() done
12180 1727204056.98846: dumping result to json
12180 1727204056.98855: done dumping result, returning
12180 1727204056.98914: done running TaskExecutor() for managed-node1/TASK: Fix CentOS6 Base repo [0affcd87-79f5-ccb1-55ae-0000000000e3]
12180 1727204056.98925: sending task result for task 0affcd87-79f5-ccb1-55ae-0000000000e3
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '6'",
    "skip_reason": "Conditional result was False"
}
12180 1727204056.99284: no more pending results, returning what we have
12180 1727204056.99288: results queue empty
12180 1727204056.99289: checking for any_errors_fatal
12180 1727204056.99297: done checking for any_errors_fatal
12180 1727204056.99297: checking for max_fail_percentage
12180 1727204056.99299: done checking for max_fail_percentage
12180 1727204056.99300: checking to see if all hosts have failed and the running result is not ok
12180 1727204056.99301: done checking to see if all hosts have failed
12180 1727204056.99301: getting the remaining hosts for this loop
12180 1727204056.99303: done getting the remaining hosts for this loop
12180 1727204056.99306: getting the next task for host managed-node1
12180 1727204056.99314: done getting next task for host managed-node1
12180 1727204056.99317: ^ task is: TASK: Include the task 'enable_epel.yml'
12180 1727204056.99320: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12180 1727204056.99324: getting variables
12180 1727204056.99326: in VariableManager get_vars()
12180 1727204056.99356: Calling all_inventory to load vars for managed-node1
12180 1727204056.99359: Calling groups_inventory to load vars for managed-node1
12180 1727204056.99362: Calling all_plugins_inventory to load vars for managed-node1
12180 1727204056.99378: Calling all_plugins_play to load vars for managed-node1
12180 1727204056.99380: Calling groups_plugins_inventory to load vars for managed-node1
12180 1727204056.99383: Calling groups_plugins_play to load vars for managed-node1
12180 1727204056.99823: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12180 1727204057.00428: done sending task result for task 0affcd87-79f5-ccb1-55ae-0000000000e3
12180 1727204057.00431: WORKER PROCESS EXITING
12180 1727204057.00518: done with get_vars()
12180 1727204057.00532: done getting variables
TASK [Include the task 'enable_epel.yml'] **************************************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51
Tuesday 24 September 2024 14:54:17 -0400 (0:00:00.089) 0:00:04.424 *****
12180 1727204057.01287: entering _queue_task() for managed-node1/include_tasks
12180 1727204057.02344: worker is 1 (out of 1 available)
12180 1727204057.02361: exiting _queue_task() for managed-node1/include_tasks
12180 1727204057.02379: done queuing things up, now waiting for results queue to drain
12180 1727204057.02381: waiting for pending results...
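The "Include the task 'enable_epel.yml'" banner above points at el_repo_setup.yml:51, and the debug output later shows the include is gated on the `__network_is_ostree` fact set a few tasks earlier. A plausible reconstruction of that include, hedged because the actual file is not reproduced in this log (only the task name, the `include_tasks` action, and the evaluated conditional appear):

```yaml
# Sketch based on the conditional the log reports:
# Evaluated conditional (not __network_is_ostree | d(false)): True
- name: Include the task 'enable_epel.yml'
  include_tasks: enable_epel.yml
  when: not __network_is_ostree | d(false)
```

Because the earlier stat of the ostree marker returned `"exists": false`, `__network_is_ostree` is false, the condition holds, and the log goes on to load and process enable_epel.yml for managed-node1.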
12180 1727204057.04462: running TaskExecutor() for managed-node1/TASK: Include the task 'enable_epel.yml'
12180 1727204057.04913: in run() - task 0affcd87-79f5-ccb1-55ae-0000000000e4
12180 1727204057.05048: variable 'ansible_search_path' from source: unknown
12180 1727204057.05056: variable 'ansible_search_path' from source: unknown
12180 1727204057.05100: calling self._execute()
12180 1727204057.05188: variable 'ansible_host' from source: host vars for 'managed-node1'
12180 1727204057.05255: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12180 1727204057.05272: variable 'omit' from source: magic vars
12180 1727204057.06347: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
12180 1727204057.13923: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
12180 1727204057.14332: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
12180 1727204057.14511: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
12180 1727204057.14692: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
12180 1727204057.14731: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
12180 1727204057.15398: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12180 1727204057.15658: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12180 1727204057.15891: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12180 1727204057.16078: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12180 1727204057.16160: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12180 1727204057.17154: variable '__network_is_ostree' from source: set_fact
12180 1727204057.17252: Evaluated conditional (not __network_is_ostree | d(false)): True
12180 1727204057.17275: _execute() done
12180 1727204057.17280: dumping result to json
12180 1727204057.17283: done dumping result, returning
12180 1727204057.17286: done running TaskExecutor() for managed-node1/TASK: Include the task 'enable_epel.yml' [0affcd87-79f5-ccb1-55ae-0000000000e4]
12180 1727204057.17291: sending task result for task 0affcd87-79f5-ccb1-55ae-0000000000e4
12180 1727204057.17528: done sending task result for task 0affcd87-79f5-ccb1-55ae-0000000000e4
12180 1727204057.17570: no more pending results, returning what we have
12180 1727204057.17578: in VariableManager get_vars()
12180 1727204057.17617: Calling all_inventory to load vars for managed-node1
12180 1727204057.17621: Calling groups_inventory to load vars for managed-node1
12180 1727204057.17625: Calling all_plugins_inventory to load vars for managed-node1
12180 1727204057.17637: Calling all_plugins_play to load vars for managed-node1
12180 1727204057.17640: Calling groups_plugins_inventory to load vars for managed-node1
12180 1727204057.17643: Calling groups_plugins_play to load vars for managed-node1
12180 1727204057.17876: WORKER PROCESS EXITING
12180 1727204057.17891: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12180 1727204057.18201: done with get_vars()
12180 1727204057.18210: variable 'ansible_search_path' from source: unknown
12180 1727204057.18211: variable 'ansible_search_path' from source: unknown
12180 1727204057.18535: we have included files to process
12180 1727204057.18536: generating all_blocks data
12180 1727204057.18538: done generating all_blocks data
12180 1727204057.18542: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml
12180 1727204057.18543: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml
12180 1727204057.18545: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml
12180 1727204057.21451: done processing included file
12180 1727204057.21454: iterating over new_blocks loaded from include file
12180 1727204057.21456: in VariableManager get_vars()
12180 1727204057.21472: done with get_vars()
12180 1727204057.21474: filtering new block on tags
12180 1727204057.21519: done filtering new block on tags
12180 1727204057.21524: in VariableManager get_vars()
12180 1727204057.21604: done with get_vars()
12180 1727204057.21607: filtering new block on tags
12180 1727204057.21621: done filtering new block on tags
12180 1727204057.21623: done iterating over new_blocks loaded from include file
included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed-node1
12180 1727204057.21630: extending task lists for all hosts with included blocks
12180 1727204057.21977: done extending task lists
12180 1727204057.21979: done processing included files
12180 1727204057.21980: results queue empty
12180 1727204057.21981: checking for any_errors_fatal
12180 1727204057.21984: done checking for any_errors_fatal
12180 1727204057.21985: checking for max_fail_percentage
12180 1727204057.21986: done checking for max_fail_percentage
12180 1727204057.21987: checking to see if all hosts have failed and the running result is not ok
12180 1727204057.21987: done checking to see if all hosts have failed
12180 1727204057.21988: getting the remaining hosts for this loop
12180 1727204057.21989: done getting the remaining hosts for this loop
12180 1727204057.21992: getting the next task for host managed-node1
12180 1727204057.22056: done getting next task for host managed-node1
12180 1727204057.22059: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }}
12180 1727204057.22062: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12180 1727204057.22067: getting variables
12180 1727204057.22068: in VariableManager get_vars()
12180 1727204057.22165: Calling all_inventory to load vars for managed-node1
12180 1727204057.22170: Calling groups_inventory to load vars for managed-node1
12180 1727204057.22173: Calling all_plugins_inventory to load vars for managed-node1
12180 1727204057.22178: Calling all_plugins_play to load vars for managed-node1
12180 1727204057.22186: Calling groups_plugins_inventory to load vars for managed-node1
12180 1727204057.22189: Calling groups_plugins_play to load vars for managed-node1
12180 1727204057.22648: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12180 1727204057.23209: done with get_vars()
12180 1727204057.23219: done getting variables
12180 1727204057.23642: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)
12180 1727204057.24830: variable 'ansible_distribution_major_version' from source: facts
TASK [Create EPEL 9] ***********************************************************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8
Tuesday 24 September 2024 14:54:17 -0400 (0:00:00.236) 0:00:04.661 *****
12180 1727204057.24955: entering _queue_task() for managed-node1/command
12180 1727204057.24957: Creating lock for command
12180 1727204057.25736: worker is 1 (out of 1 available)
12180 1727204057.25749: exiting _queue_task() for managed-node1/command
12180 1727204057.25761: done queuing things up, now waiting for results queue to drain
12180 1727204057.25945: waiting for pending results...
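The templated task name "Create EPEL {{ ansible_distribution_major_version }}" renders as "Create EPEL 9" on this host. The log shows a `command` action at enable_epel.yml:8 guarded by two conditionals; a hedged sketch follows, with a placeholder body because the actual command is not shown anywhere in this log:

```yaml
# Only the action type ('command') and the 'when' list are visible in the log.
- name: Create EPEL {{ ansible_distribution_major_version }}
  command: echo "EPEL setup command placeholder"  # hypothetical body
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version in ['7', '8']
```

Since the host reports major version 9, the second condition is False and the task is skipped.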
12180 1727204057.26877: running TaskExecutor() for managed-node1/TASK: Create EPEL 9
12180 1727204057.27137: in run() - task 0affcd87-79f5-ccb1-55ae-0000000000fe
12180 1727204057.27220: variable 'ansible_search_path' from source: unknown
12180 1727204057.27228: variable 'ansible_search_path' from source: unknown
12180 1727204057.27278: calling self._execute()
12180 1727204057.27493: variable 'ansible_host' from source: host vars for 'managed-node1'
12180 1727204057.27505: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12180 1727204057.27570: variable 'omit' from source: magic vars
12180 1727204057.28367: variable 'ansible_distribution' from source: facts
12180 1727204057.28420: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True
12180 1727204057.28734: variable 'ansible_distribution_major_version' from source: facts
12180 1727204057.28745: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False
12180 1727204057.28753: when evaluation is False, skipping this task
12180 1727204057.28761: _execute() done
12180 1727204057.28771: dumping result to json
12180 1727204057.28785: done dumping result, returning
12180 1727204057.28796: done running TaskExecutor() for managed-node1/TASK: Create EPEL 9 [0affcd87-79f5-ccb1-55ae-0000000000fe]
12180 1727204057.28846: sending task result for task 0affcd87-79f5-ccb1-55ae-0000000000fe
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version in ['7', '8']",
    "skip_reason": "Conditional result was False"
}
12180 1727204057.29081: no more pending results, returning what we have
12180 1727204057.29084: results queue empty
12180 1727204057.29085: checking for any_errors_fatal
12180 1727204057.29087: done checking for any_errors_fatal
12180 1727204057.29087: checking for max_fail_percentage
12180 1727204057.29089: done checking for max_fail_percentage
12180 1727204057.29090: checking to see if all hosts have failed and the running result is not ok
12180 1727204057.29091: done checking to see if all hosts have failed
12180 1727204057.29093: getting the remaining hosts for this loop
12180 1727204057.29094: done getting the remaining hosts for this loop
12180 1727204057.29098: getting the next task for host managed-node1
12180 1727204057.29107: done getting next task for host managed-node1
12180 1727204057.29109: ^ task is: TASK: Install yum-utils package
12180 1727204057.29116: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12180 1727204057.29120: getting variables
12180 1727204057.29122: in VariableManager get_vars()
12180 1727204057.29198: Calling all_inventory to load vars for managed-node1
12180 1727204057.29202: Calling groups_inventory to load vars for managed-node1
12180 1727204057.29207: Calling all_plugins_inventory to load vars for managed-node1
12180 1727204057.29220: Calling all_plugins_play to load vars for managed-node1
12180 1727204057.29222: Calling groups_plugins_inventory to load vars for managed-node1
12180 1727204057.29224: Calling groups_plugins_play to load vars for managed-node1
12180 1727204057.29385: done sending task result for task 0affcd87-79f5-ccb1-55ae-0000000000fe
12180 1727204057.29389: WORKER PROCESS EXITING
12180 1727204057.29423: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12180 1727204057.29660: done with get_vars()
12180 1727204057.29675: done getting variables
12180 1727204057.30049: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)
TASK [Install yum-utils package] ***********************************************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26
Tuesday 24 September 2024 14:54:17 -0400 (0:00:00.052) 0:00:04.713 *****
12180 1727204057.30224: entering _queue_task() for managed-node1/package
12180 1727204057.30229: Creating lock for package
12180 1727204057.31741: worker is 1 (out of 1 available)
12180 1727204057.31753: exiting _queue_task() for managed-node1/package
12180 1727204057.31767: done queuing things up, now waiting for results queue to drain
12180 1727204057.31769: waiting for pending results...
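The "Install yum-utils package" task at enable_epel.yml:26 uses the generic `package` action (the log loads ActionModule 'package') under the same distribution guard. A minimal sketch consistent with the logged name, action type, and conditions (the package name is inferred from the task name; `state: present` is an assumption):

```yaml
- name: Install yum-utils package
  package:
    name: yum-utils      # inferred from the task name
    state: present       # assumed; not visible in the log
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version in ['7', '8']
```

It is again skipped here because the host's major version is 9.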
12180 1727204057.32093: running TaskExecutor() for managed-node1/TASK: Install yum-utils package
12180 1727204057.32231: in run() - task 0affcd87-79f5-ccb1-55ae-0000000000ff
12180 1727204057.32260: variable 'ansible_search_path' from source: unknown
12180 1727204057.32271: variable 'ansible_search_path' from source: unknown
12180 1727204057.32314: calling self._execute()
12180 1727204057.32412: variable 'ansible_host' from source: host vars for 'managed-node1'
12180 1727204057.32423: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12180 1727204057.32443: variable 'omit' from source: magic vars
12180 1727204057.33167: variable 'ansible_distribution' from source: facts
12180 1727204057.33188: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True
12180 1727204057.33376: variable 'ansible_distribution_major_version' from source: facts
12180 1727204057.33423: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False
12180 1727204057.33434: when evaluation is False, skipping this task
12180 1727204057.33442: _execute() done
12180 1727204057.33449: dumping result to json
12180 1727204057.33456: done dumping result, returning
12180 1727204057.33478: done running TaskExecutor() for managed-node1/TASK: Install yum-utils package [0affcd87-79f5-ccb1-55ae-0000000000ff]
12180 1727204057.33545: sending task result for task 0affcd87-79f5-ccb1-55ae-0000000000ff
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version in ['7', '8']",
    "skip_reason": "Conditional result was False"
}
12180 1727204057.33729: no more pending results, returning what we have
12180 1727204057.33733: results queue empty
12180 1727204057.33734: checking for any_errors_fatal
12180 1727204057.33740: done checking for any_errors_fatal
12180 1727204057.33740: checking for max_fail_percentage
12180 1727204057.33742: done checking for max_fail_percentage
12180 1727204057.33743: checking to see if all hosts have failed and the running result is not ok
12180 1727204057.33744: done checking to see if all hosts have failed
12180 1727204057.33745: getting the remaining hosts for this loop
12180 1727204057.33747: done getting the remaining hosts for this loop
12180 1727204057.33751: getting the next task for host managed-node1
12180 1727204057.33758: done getting next task for host managed-node1
12180 1727204057.33761: ^ task is: TASK: Enable EPEL 7
12180 1727204057.33768: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12180 1727204057.33773: getting variables
12180 1727204057.33775: in VariableManager get_vars()
12180 1727204057.33876: Calling all_inventory to load vars for managed-node1
12180 1727204057.33880: Calling groups_inventory to load vars for managed-node1
12180 1727204057.33885: Calling all_plugins_inventory to load vars for managed-node1
12180 1727204057.33898: Calling all_plugins_play to load vars for managed-node1
12180 1727204057.33901: Calling groups_plugins_inventory to load vars for managed-node1
12180 1727204057.33904: Calling groups_plugins_play to load vars for managed-node1
12180 1727204057.34179: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12180 1727204057.34391: done with get_vars()
12180 1727204057.34402: done getting variables
12180 1727204057.34485: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
TASK [Enable EPEL 7] ***********************************************************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32
Tuesday 24 September 2024 14:54:17 -0400 (0:00:00.043) 0:00:04.757 *****
12180 1727204057.34522: entering _queue_task() for managed-node1/command
12180 1727204057.34546: done sending task result for task 0affcd87-79f5-ccb1-55ae-0000000000ff
12180 1727204057.34557: WORKER PROCESS EXITING
12180 1727204057.35048: worker is 1 (out of 1 available)
12180 1727204057.35061: exiting _queue_task() for managed-node1/command
12180 1727204057.35074: done queuing things up, now waiting for results queue to drain
12180 1727204057.35075: waiting for pending results...
12180 1727204057.35709: running TaskExecutor() for managed-node1/TASK: Enable EPEL 7 12180 1727204057.35834: in run() - task 0affcd87-79f5-ccb1-55ae-000000000100 12180 1727204057.35854: variable 'ansible_search_path' from source: unknown 12180 1727204057.35862: variable 'ansible_search_path' from source: unknown 12180 1727204057.35917: calling self._execute() 12180 1727204057.36000: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204057.36139: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204057.36154: variable 'omit' from source: magic vars 12180 1727204057.37291: variable 'ansible_distribution' from source: facts 12180 1727204057.37346: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 12180 1727204057.37717: variable 'ansible_distribution_major_version' from source: facts 12180 1727204057.37734: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 12180 1727204057.37771: when evaluation is False, skipping this task 12180 1727204057.37779: _execute() done 12180 1727204057.37787: dumping result to json 12180 1727204057.37795: done dumping result, returning 12180 1727204057.37882: done running TaskExecutor() for managed-node1/TASK: Enable EPEL 7 [0affcd87-79f5-ccb1-55ae-000000000100] 12180 1727204057.37896: sending task result for task 0affcd87-79f5-ccb1-55ae-000000000100 skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 12180 1727204057.38069: no more pending results, returning what we have 12180 1727204057.38073: results queue empty 12180 1727204057.38074: checking for any_errors_fatal 12180 1727204057.38082: done checking for any_errors_fatal 12180 1727204057.38083: checking for max_fail_percentage 12180 1727204057.38085: done checking for max_fail_percentage 12180 1727204057.38086: checking to see if all hosts have failed and 
the running result is not ok 12180 1727204057.38087: done checking to see if all hosts have failed 12180 1727204057.38087: getting the remaining hosts for this loop 12180 1727204057.38089: done getting the remaining hosts for this loop 12180 1727204057.38093: getting the next task for host managed-node1 12180 1727204057.38101: done getting next task for host managed-node1 12180 1727204057.38104: ^ task is: TASK: Enable EPEL 8 12180 1727204057.38109: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12180 1727204057.38113: getting variables 12180 1727204057.38115: in VariableManager get_vars() 12180 1727204057.38152: Calling all_inventory to load vars for managed-node1 12180 1727204057.38155: Calling groups_inventory to load vars for managed-node1 12180 1727204057.38159: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204057.38176: Calling all_plugins_play to load vars for managed-node1 12180 1727204057.38179: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204057.38183: Calling groups_plugins_play to load vars for managed-node1 12180 1727204057.38375: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204057.38581: done with get_vars() 12180 1727204057.38593: done getting variables 12180 1727204057.38675: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Tuesday 24 September 2024 14:54:17 -0400 (0:00:00.041) 0:00:04.799 ***** 12180 1727204057.38712: entering _queue_task() for managed-node1/command 12180 1727204057.38736: done sending task result for task 0affcd87-79f5-ccb1-55ae-000000000100 12180 1727204057.38746: WORKER PROCESS EXITING 12180 1727204057.39443: worker is 1 (out of 1 available) 12180 1727204057.39454: exiting _queue_task() for managed-node1/command 12180 1727204057.39618: done queuing things up, now waiting for results queue to drain 12180 1727204057.39621: waiting for pending results... 
12180 1727204057.40016: running TaskExecutor() for managed-node1/TASK: Enable EPEL 8 12180 1727204057.40190: in run() - task 0affcd87-79f5-ccb1-55ae-000000000101 12180 1727204057.40440: variable 'ansible_search_path' from source: unknown 12180 1727204057.40448: variable 'ansible_search_path' from source: unknown 12180 1727204057.40490: calling self._execute() 12180 1727204057.40833: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204057.40846: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204057.40861: variable 'omit' from source: magic vars 12180 1727204057.41919: variable 'ansible_distribution' from source: facts 12180 1727204057.42196: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 12180 1727204057.42468: variable 'ansible_distribution_major_version' from source: facts 12180 1727204057.42481: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 12180 1727204057.42489: when evaluation is False, skipping this task 12180 1727204057.42495: _execute() done 12180 1727204057.42502: dumping result to json 12180 1727204057.42510: done dumping result, returning 12180 1727204057.42519: done running TaskExecutor() for managed-node1/TASK: Enable EPEL 8 [0affcd87-79f5-ccb1-55ae-000000000101] 12180 1727204057.42533: sending task result for task 0affcd87-79f5-ccb1-55ae-000000000101 skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 12180 1727204057.42688: no more pending results, returning what we have 12180 1727204057.42692: results queue empty 12180 1727204057.42693: checking for any_errors_fatal 12180 1727204057.42701: done checking for any_errors_fatal 12180 1727204057.42702: checking for max_fail_percentage 12180 1727204057.42704: done checking for max_fail_percentage 12180 1727204057.42705: checking to see if all hosts have failed and 
the running result is not ok 12180 1727204057.42706: done checking to see if all hosts have failed 12180 1727204057.42707: getting the remaining hosts for this loop 12180 1727204057.42708: done getting the remaining hosts for this loop 12180 1727204057.42712: getting the next task for host managed-node1 12180 1727204057.42723: done getting next task for host managed-node1 12180 1727204057.42726: ^ task is: TASK: Enable EPEL 6 12180 1727204057.42733: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12180 1727204057.42737: getting variables 12180 1727204057.42739: in VariableManager get_vars() 12180 1727204057.42823: Calling all_inventory to load vars for managed-node1 12180 1727204057.42826: Calling groups_inventory to load vars for managed-node1 12180 1727204057.42833: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204057.42847: Calling all_plugins_play to load vars for managed-node1 12180 1727204057.42850: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204057.42853: Calling groups_plugins_play to load vars for managed-node1 12180 1727204057.43021: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204057.43236: done with get_vars() 12180 1727204057.43247: done getting variables 12180 1727204057.43680: done sending task result for task 0affcd87-79f5-ccb1-55ae-000000000101 12180 1727204057.43684: WORKER PROCESS EXITING 12180 1727204057.43720: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Tuesday 24 September 2024 14:54:17 -0400 (0:00:00.050) 0:00:04.849 ***** 12180 1727204057.43756: entering _queue_task() for managed-node1/copy 12180 1727204057.44158: worker is 1 (out of 1 available) 12180 1727204057.44380: exiting _queue_task() for managed-node1/copy 12180 1727204057.44393: done queuing things up, now waiting for results queue to drain 12180 1727204057.44394: waiting for pending results... 
12180 1727204057.45059: running TaskExecutor() for managed-node1/TASK: Enable EPEL 6 12180 1727204057.45356: in run() - task 0affcd87-79f5-ccb1-55ae-000000000103 12180 1727204057.45501: variable 'ansible_search_path' from source: unknown 12180 1727204057.45510: variable 'ansible_search_path' from source: unknown 12180 1727204057.45552: calling self._execute() 12180 1727204057.45652: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204057.45666: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204057.45682: variable 'omit' from source: magic vars 12180 1727204057.46216: variable 'ansible_distribution' from source: facts 12180 1727204057.46240: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 12180 1727204057.46411: variable 'ansible_distribution_major_version' from source: facts 12180 1727204057.46422: Evaluated conditional (ansible_distribution_major_version == '6'): False 12180 1727204057.46433: when evaluation is False, skipping this task 12180 1727204057.46440: _execute() done 12180 1727204057.46447: dumping result to json 12180 1727204057.46454: done dumping result, returning 12180 1727204057.46477: done running TaskExecutor() for managed-node1/TASK: Enable EPEL 6 [0affcd87-79f5-ccb1-55ae-000000000103] 12180 1727204057.46493: sending task result for task 0affcd87-79f5-ccb1-55ae-000000000103 skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 12180 1727204057.46655: no more pending results, returning what we have 12180 1727204057.46659: results queue empty 12180 1727204057.46660: checking for any_errors_fatal 12180 1727204057.46668: done checking for any_errors_fatal 12180 1727204057.46669: checking for max_fail_percentage 12180 1727204057.46672: done checking for max_fail_percentage 12180 1727204057.46673: checking to see if all hosts have failed and the running 
result is not ok 12180 1727204057.46674: done checking to see if all hosts have failed 12180 1727204057.46675: getting the remaining hosts for this loop 12180 1727204057.46676: done getting the remaining hosts for this loop 12180 1727204057.46680: getting the next task for host managed-node1 12180 1727204057.46692: done getting next task for host managed-node1 12180 1727204057.46694: ^ task is: TASK: Set network provider to 'nm' 12180 1727204057.46697: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12180 1727204057.46703: getting variables 12180 1727204057.46704: in VariableManager get_vars() 12180 1727204057.46741: Calling all_inventory to load vars for managed-node1 12180 1727204057.46744: Calling groups_inventory to load vars for managed-node1 12180 1727204057.46748: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204057.46762: Calling all_plugins_play to load vars for managed-node1 12180 1727204057.46767: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204057.46771: Calling groups_plugins_play to load vars for managed-node1 12180 1727204057.46957: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204057.47286: done with get_vars() 12180 1727204057.47298: done getting variables 12180 1727204057.47383: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: 
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_deprecated_nm.yml:13 Tuesday 24 September 2024 14:54:17 -0400 (0:00:00.036) 0:00:04.886 ***** 12180 1727204057.47417: entering _queue_task() for managed-node1/set_fact 12180 1727204057.47440: done sending task result for task 0affcd87-79f5-ccb1-55ae-000000000103 12180 1727204057.47449: WORKER PROCESS EXITING 12180 1727204057.47920: worker is 1 (out of 1 available) 12180 1727204057.47934: exiting _queue_task() for managed-node1/set_fact 12180 1727204057.47945: done queuing things up, now waiting for results queue to drain 12180 1727204057.47946: waiting for pending results... 12180 1727204057.48483: running TaskExecutor() for managed-node1/TASK: Set network provider to 'nm' 12180 1727204057.48607: in run() - task 0affcd87-79f5-ccb1-55ae-000000000007 12180 1727204057.48630: variable 'ansible_search_path' from source: unknown 12180 1727204057.49023: calling self._execute() 12180 1727204057.49220: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204057.49236: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204057.49251: variable 'omit' from source: magic vars 12180 1727204057.49370: variable 'omit' from source: magic vars 12180 1727204057.49403: variable 'omit' from source: magic vars 12180 1727204057.49476: variable 'omit' from source: magic vars 12180 1727204057.49556: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12180 1727204057.49603: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12180 1727204057.49636: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12180 1727204057.49667: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204057.49683: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204057.49717: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12180 1727204057.49726: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204057.49737: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204057.49849: Set connection var ansible_pipelining to False 12180 1727204057.49858: Set connection var ansible_shell_type to sh 12180 1727204057.49879: Set connection var ansible_timeout to 10 12180 1727204057.49890: Set connection var ansible_connection to ssh 12180 1727204057.49899: Set connection var ansible_shell_executable to /bin/sh 12180 1727204057.49910: Set connection var ansible_module_compression to ZIP_DEFLATED 12180 1727204057.50004: variable 'ansible_shell_executable' from source: unknown 12180 1727204057.50014: variable 'ansible_connection' from source: unknown 12180 1727204057.50021: variable 'ansible_module_compression' from source: unknown 12180 1727204057.50030: variable 'ansible_shell_type' from source: unknown 12180 1727204057.50038: variable 'ansible_shell_executable' from source: unknown 12180 1727204057.50045: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204057.50090: variable 'ansible_pipelining' from source: unknown 12180 1727204057.50101: variable 'ansible_timeout' from source: unknown 12180 1727204057.50110: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204057.50437: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12180 1727204057.50454: variable 'omit' from source: magic vars 12180 1727204057.50466: starting 
attempt loop 12180 1727204057.50474: running the handler 12180 1727204057.50490: handler run complete 12180 1727204057.50504: attempt loop complete, returning result 12180 1727204057.50512: _execute() done 12180 1727204057.50518: dumping result to json 12180 1727204057.50525: done dumping result, returning 12180 1727204057.50546: done running TaskExecutor() for managed-node1/TASK: Set network provider to 'nm' [0affcd87-79f5-ccb1-55ae-000000000007] 12180 1727204057.50557: sending task result for task 0affcd87-79f5-ccb1-55ae-000000000007 ok: [managed-node1] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 12180 1727204057.50787: no more pending results, returning what we have 12180 1727204057.50791: results queue empty 12180 1727204057.50792: checking for any_errors_fatal 12180 1727204057.50798: done checking for any_errors_fatal 12180 1727204057.50799: checking for max_fail_percentage 12180 1727204057.50801: done checking for max_fail_percentage 12180 1727204057.50802: checking to see if all hosts have failed and the running result is not ok 12180 1727204057.50802: done checking to see if all hosts have failed 12180 1727204057.50803: getting the remaining hosts for this loop 12180 1727204057.50806: done getting the remaining hosts for this loop 12180 1727204057.50810: getting the next task for host managed-node1 12180 1727204057.50819: done getting next task for host managed-node1 12180 1727204057.50821: ^ task is: TASK: meta (flush_handlers) 12180 1727204057.50823: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12180 1727204057.50831: getting variables 12180 1727204057.50833: in VariableManager get_vars() 12180 1727204057.50859: Calling all_inventory to load vars for managed-node1 12180 1727204057.50862: Calling groups_inventory to load vars for managed-node1 12180 1727204057.50867: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204057.50900: Calling all_plugins_play to load vars for managed-node1 12180 1727204057.50903: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204057.50906: Calling groups_plugins_play to load vars for managed-node1 12180 1727204057.51090: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204057.51311: done with get_vars() 12180 1727204057.51329: done getting variables 12180 1727204057.51402: in VariableManager get_vars() 12180 1727204057.51411: Calling all_inventory to load vars for managed-node1 12180 1727204057.51413: Calling groups_inventory to load vars for managed-node1 12180 1727204057.51416: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204057.51420: Calling all_plugins_play to load vars for managed-node1 12180 1727204057.51422: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204057.51425: Calling groups_plugins_play to load vars for managed-node1 12180 1727204057.51653: done sending task result for task 0affcd87-79f5-ccb1-55ae-000000000007 12180 1727204057.51657: WORKER PROCESS EXITING 12180 1727204057.51913: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204057.52148: done with get_vars() 12180 1727204057.52162: done queuing things up, now waiting for results queue to drain 12180 1727204057.52166: results queue empty 12180 1727204057.52167: checking for any_errors_fatal 12180 1727204057.52169: done checking for any_errors_fatal 12180 1727204057.52170: checking for max_fail_percentage 12180 
1727204057.52171: done checking for max_fail_percentage 12180 1727204057.52172: checking to see if all hosts have failed and the running result is not ok 12180 1727204057.52173: done checking to see if all hosts have failed 12180 1727204057.52174: getting the remaining hosts for this loop 12180 1727204057.52174: done getting the remaining hosts for this loop 12180 1727204057.52178: getting the next task for host managed-node1 12180 1727204057.52182: done getting next task for host managed-node1 12180 1727204057.52184: ^ task is: TASK: meta (flush_handlers) 12180 1727204057.52185: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12180 1727204057.52242: getting variables 12180 1727204057.52244: in VariableManager get_vars() 12180 1727204057.52253: Calling all_inventory to load vars for managed-node1 12180 1727204057.52256: Calling groups_inventory to load vars for managed-node1 12180 1727204057.52258: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204057.52262: Calling all_plugins_play to load vars for managed-node1 12180 1727204057.52266: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204057.52269: Calling groups_plugins_play to load vars for managed-node1 12180 1727204057.52436: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204057.52743: done with get_vars() 12180 1727204057.52755: done getting variables 12180 1727204057.52804: in VariableManager get_vars() 12180 1727204057.52814: Calling all_inventory to load vars for managed-node1 12180 1727204057.52816: Calling groups_inventory to load vars for managed-node1 12180 1727204057.52818: Calling all_plugins_inventory to load vars for managed-node1 12180 
1727204057.52822: Calling all_plugins_play to load vars for managed-node1 12180 1727204057.52825: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204057.52831: Calling groups_plugins_play to load vars for managed-node1 12180 1727204057.53081: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204057.53352: done with get_vars() 12180 1727204057.53366: done queuing things up, now waiting for results queue to drain 12180 1727204057.53368: results queue empty 12180 1727204057.53369: checking for any_errors_fatal 12180 1727204057.53370: done checking for any_errors_fatal 12180 1727204057.53371: checking for max_fail_percentage 12180 1727204057.53373: done checking for max_fail_percentage 12180 1727204057.53373: checking to see if all hosts have failed and the running result is not ok 12180 1727204057.53374: done checking to see if all hosts have failed 12180 1727204057.53375: getting the remaining hosts for this loop 12180 1727204057.53376: done getting the remaining hosts for this loop 12180 1727204057.53378: getting the next task for host managed-node1 12180 1727204057.53381: done getting next task for host managed-node1 12180 1727204057.53382: ^ task is: None 12180 1727204057.53384: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12180 1727204057.53385: done queuing things up, now waiting for results queue to drain 12180 1727204057.53386: results queue empty 12180 1727204057.53386: checking for any_errors_fatal 12180 1727204057.53387: done checking for any_errors_fatal 12180 1727204057.53388: checking for max_fail_percentage 12180 1727204057.53389: done checking for max_fail_percentage 12180 1727204057.53389: checking to see if all hosts have failed and the running result is not ok 12180 1727204057.53390: done checking to see if all hosts have failed 12180 1727204057.53392: getting the next task for host managed-node1 12180 1727204057.53394: done getting next task for host managed-node1 12180 1727204057.53395: ^ task is: None 12180 1727204057.53396: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12180 1727204057.53524: in VariableManager get_vars() 12180 1727204057.53552: done with get_vars() 12180 1727204057.53559: in VariableManager get_vars() 12180 1727204057.53576: done with get_vars() 12180 1727204057.53581: variable 'omit' from source: magic vars 12180 1727204057.53613: in VariableManager get_vars() 12180 1727204057.53637: done with get_vars() 12180 1727204057.53661: variable 'omit' from source: magic vars PLAY [Play for testing bond device using deprecated 'master' argument] ********* 12180 1727204057.54931: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 12180 1727204057.55726: getting the remaining hosts for this loop 12180 1727204057.55732: done getting the remaining hosts for this loop 12180 1727204057.55738: getting the next task for host managed-node1 12180 1727204057.55836: done getting next task for host managed-node1 12180 1727204057.55840: ^ task is: TASK: Gathering Facts 12180 1727204057.55841: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12180 1727204057.55844: getting variables 12180 1727204057.55845: in VariableManager get_vars() 12180 1727204057.55873: Calling all_inventory to load vars for managed-node1 12180 1727204057.55875: Calling groups_inventory to load vars for managed-node1 12180 1727204057.55877: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204057.55883: Calling all_plugins_play to load vars for managed-node1 12180 1727204057.55895: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204057.55898: Calling groups_plugins_play to load vars for managed-node1 12180 1727204057.56090: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204057.56476: done with get_vars() 12180 1727204057.56485: done getting variables 12180 1727204057.56541: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml:3 Tuesday 24 September 2024 14:54:17 -0400 (0:00:00.091) 0:00:04.977 ***** 12180 1727204057.56567: entering _queue_task() for managed-node1/gather_facts 12180 1727204057.56859: worker is 1 (out of 1 available) 12180 1727204057.56874: exiting _queue_task() for managed-node1/gather_facts 12180 1727204057.56884: done queuing things up, now waiting for results queue to drain 12180 1727204057.56886: waiting for pending results... 
12180 1727204057.57185: running TaskExecutor() for managed-node1/TASK: Gathering Facts 12180 1727204057.57321: in run() - task 0affcd87-79f5-ccb1-55ae-000000000129 12180 1727204057.57346: variable 'ansible_search_path' from source: unknown 12180 1727204057.57385: calling self._execute() 12180 1727204057.57479: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204057.57490: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204057.57501: variable 'omit' from source: magic vars 12180 1727204057.58176: variable 'ansible_distribution_major_version' from source: facts 12180 1727204057.58198: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204057.58209: variable 'omit' from source: magic vars 12180 1727204057.58862: variable 'omit' from source: magic vars 12180 1727204057.59056: variable 'omit' from source: magic vars 12180 1727204057.59104: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12180 1727204057.59204: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12180 1727204057.59242: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12180 1727204057.59268: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204057.59285: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204057.59323: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12180 1727204057.59337: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204057.59352: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204057.59577: Set connection var ansible_pipelining to False 12180 1727204057.59586: Set 
connection var ansible_shell_type to sh 12180 1727204057.59596: Set connection var ansible_timeout to 10 12180 1727204057.59607: Set connection var ansible_connection to ssh 12180 1727204057.59617: Set connection var ansible_shell_executable to /bin/sh 12180 1727204057.59626: Set connection var ansible_module_compression to ZIP_DEFLATED 12180 1727204057.59662: variable 'ansible_shell_executable' from source: unknown 12180 1727204057.59818: variable 'ansible_connection' from source: unknown 12180 1727204057.59829: variable 'ansible_module_compression' from source: unknown 12180 1727204057.59837: variable 'ansible_shell_type' from source: unknown 12180 1727204057.59844: variable 'ansible_shell_executable' from source: unknown 12180 1727204057.59850: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204057.59856: variable 'ansible_pipelining' from source: unknown 12180 1727204057.59861: variable 'ansible_timeout' from source: unknown 12180 1727204057.59875: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204057.60147: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12180 1727204057.60167: variable 'omit' from source: magic vars 12180 1727204057.60178: starting attempt loop 12180 1727204057.60184: running the handler 12180 1727204057.60201: variable 'ansible_facts' from source: unknown 12180 1727204057.60269: _low_level_execute_command(): starting 12180 1727204057.60369: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12180 1727204057.61854: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204057.61919: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 
12180 1727204057.61938: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204057.62016: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204057.62067: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204057.62107: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204057.62124: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204057.62145: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204057.62234: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204057.62249: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204057.62262: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204057.62281: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204057.62299: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204057.62312: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204057.62334: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204057.62351: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204057.62436: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204057.62510: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204057.62536: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204057.62648: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 
1727204057.64208: stdout chunk (state=3): >>>/root <<< 12180 1727204057.64509: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204057.64623: stderr chunk (state=3): >>><<< 12180 1727204057.64626: stdout chunk (state=3): >>><<< 12180 1727204057.64747: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204057.64750: _low_level_execute_command(): starting 12180 1727204057.64753: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204057.646542-12631-278463769028223 `" && echo ansible-tmp-1727204057.646542-12631-278463769028223="` echo /root/.ansible/tmp/ansible-tmp-1727204057.646542-12631-278463769028223 `" ) && sleep 0' 12180 1727204057.67951: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 <<< 12180 1727204057.68078: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204057.68173: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204057.68197: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204057.68246: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204057.68259: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204057.68275: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204057.68295: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204057.68313: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204057.68333: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204057.68404: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204057.68424: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204057.68446: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204057.68458: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204057.68475: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204057.68490: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204057.68671: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204057.68695: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204057.68711: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 
<<< 12180 1727204057.68802: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204057.70641: stdout chunk (state=3): >>>ansible-tmp-1727204057.646542-12631-278463769028223=/root/.ansible/tmp/ansible-tmp-1727204057.646542-12631-278463769028223 <<< 12180 1727204057.70854: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204057.70857: stdout chunk (state=3): >>><<< 12180 1727204057.70860: stderr chunk (state=3): >>><<< 12180 1727204057.71073: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204057.646542-12631-278463769028223=/root/.ansible/tmp/ansible-tmp-1727204057.646542-12631-278463769028223 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204057.71077: variable 'ansible_module_compression' from source: unknown 12180 1727204057.71080: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-12180cbnqllfr/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 12180 1727204057.71082: variable 'ansible_facts' from source: unknown 12180 1727204057.71217: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204057.646542-12631-278463769028223/AnsiballZ_setup.py 12180 1727204057.72085: Sending initial data 12180 1727204057.72088: Sent initial data (153 bytes) 12180 1727204057.75794: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204057.76071: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204057.76088: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204057.76107: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204057.76163: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204057.76180: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204057.76196: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204057.76215: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204057.76230: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204057.76243: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204057.76255: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204057.76273: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204057.76293: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204057.76307: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 
1727204057.76319: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204057.76336: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204057.76530: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204057.76554: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204057.76573: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204057.76700: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204057.78370: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12180 1727204057.78436: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12180 1727204057.78493: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12180cbnqllfr/tmpfdluus71 /root/.ansible/tmp/ansible-tmp-1727204057.646542-12631-278463769028223/AnsiballZ_setup.py <<< 12180 1727204057.78548: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12180 1727204057.81601: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204057.81723: stderr chunk (state=3): >>><<< 12180 1727204057.81729: stdout chunk (state=3): >>><<< 12180 1727204057.81732: done transferring module 
to remote 12180 1727204057.81735: _low_level_execute_command(): starting 12180 1727204057.81741: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204057.646542-12631-278463769028223/ /root/.ansible/tmp/ansible-tmp-1727204057.646542-12631-278463769028223/AnsiballZ_setup.py && sleep 0' 12180 1727204057.84533: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204057.84584: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204057.84600: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204057.84620: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204057.84669: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204057.84790: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204057.84806: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204057.84824: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204057.84838: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204057.84848: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204057.84860: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204057.84878: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204057.84888: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204057.84900: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204057.84911: stderr chunk (state=3): >>>debug2: match found <<< 12180 
1727204057.84926: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204057.85005: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204057.85390: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204057.85408: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204057.85502: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204057.87287: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204057.87320: stderr chunk (state=3): >>><<< 12180 1727204057.87323: stdout chunk (state=3): >>><<< 12180 1727204057.87423: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204057.87429: 
_low_level_execute_command(): starting 12180 1727204057.87433: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204057.646542-12631-278463769028223/AnsiballZ_setup.py && sleep 0' 12180 1727204057.88880: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204057.88956: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204057.88976: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204057.88996: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204057.89043: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204057.89145: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204057.89164: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204057.89186: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204057.89198: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204057.89210: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204057.89222: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204057.89239: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204057.89254: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204057.89271: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204057.89283: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204057.89298: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204057.89378: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204057.89483: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204057.89502: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204057.89601: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204058.47377: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node1", "ansible_hostname": "managed-node1", "ansible_nodename": "managed-node1", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "622812727ab94fd6acd7dd0d437b6e90", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_is_chroot": false, "ansible_ssh_host_key_dsa_public": 
"AAAAB3NzaC1kc3MAAACBAL33r0sK53nK1ELEWEygFfBly+jKL3G1irB+e4OjfP+034giVSb4+qmZbsccYzFRUysDiwQ9AOkXxjXzuDmR+xqyvjg1wiGR1mtnhVEjD5QOMP3FrsA4T0YUj+99RePF5V1syZcivhL83fhGMQW2xqX2DsatToaaogZ2OB3PfjtjAAAAFQDxVD/D0jmbOX2y1lkpNUepQHopQwAAAIEAmIlCnXBlcPjUWk7TsM1FtBhlH1jHHCOTF1EkipNNB3yizhCo4XzHdZ42Etc3A12/rcZ94rFaauqCV6XrShBkQ2YBIcz9u8BOyWI/nScoq9IA/qLrhWWtjBWDyrdKnEa5YZssQtDa+FaZQkzy1TQpvFQxv5c95+TrmPFgDpw+0q0AAACBAKYOTFtEPTGzq9w6YdMspRWk65ZgXou58bQl818PvNnuZKKVReDFknfNCcabfj+HjlOg9wBCZZ+D3vopxZ4Qgevz/pLqcnLY7Kxx+xf6NhqDwcEwkHk/VYomBLrfyEZP8N81dcv36ZZUVoca5Y+2ZG2o1gC632nLGosyJBtmPmel", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCzhflzK5hY1zLI6wcdpu95QnCG0TwdK/8SyDudYYHDoRycyI9pVuSUQsXpQq3jHdjCurFgtKtyr8lvo1dWvKJ9SZpQk4asejncDNODyhSR95eNQg6E1G2kN1mscOp76cjW9Muvyhcku112WdRWTEojLJL5DfJAWrsWwHJI+QuhZuKvrlsxPvfOuY5td/aGC/Ydzbjkmya2qvXXJRscQArDnYmiPiatkFESRif9MXdmIn2LqQXAcZGFUG+SWQvZR1PDWKI2U5HxvoeUf+Uh2jDO3mFDWao9+SGRC2QuO+xLJgoiKIx2L3GWLTkbKjAbsk0iedaUuh+GdmUatsU09UVZi9IYBJYjhiYuZKsYx2LNpBqu8fxh5RaBfadQzkDGVBJE45/9X+9vlSygk3zMak9yWtS9vfV+CoODJx9wA1tv3r0Veiy/Y9bbcT7DtQhiWscP2X/cF2QZtdabW+Rb+zKZomn+6upN+zZeyVRClRsqVNURxevMs+UyJTKV481ayMU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBHpEZiCiKJZKK5PvXzPGl0kyJcU4P7nxoUjBffLcHt9dAB0RhjGORZ4v3/W6TdO0PAsLaKZ7WyFecLN3V9VWyiA=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIK5vZWfq5/76ny3vCPOJqG/mpsIiiNwZzQWhA7bM1PFT", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "54", "second": "18", "epoch": "1727204058", "epoch_int": "1727204058", "date": "2024-09-24", "time": "14:54:18", "iso8601_micro": 
"2024-09-24T18:54:18.175871Z", "iso8601": "2024-09-24T18:54:18Z", "iso8601_basic": "20240924T145418175871", "iso8601_basic_short": "20240924T145418", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_lsb": {}, "ansible_local": {}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2802, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 730, "free": 2802}, "nocache": {"free": 3265, "used": 267}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec28c5e6-50d6-5684-e735-f75357a23b08", "ansible_product_uuid": "ec28c5e6-50d6-5684-e735-f75357a23b08", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"], 
"labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 321, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264272871424, "block_size": 4096, "block_total": 65519355, "block_available": 64519744, "block_used": 999611, "inode_total": 131071472, "inode_available": 130998250, "inode_used": 73222, "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae"}], "ansible_fips": false, "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 42862 10.31.9.148 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 42862 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_cmdline": {"BOOT_IMAGE": 
"(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5aef1ea-3141-48ae-bf33-0c6b351dd422", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_fibre_channel_wwn": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_loadavg": {"1m": 0.39, "5m": 0.37, "15m": 0.18}, "ansible_iscsi_iqn": "", "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "12:8f:92:e7:c1:ab", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.148", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::108f:92ff:fee7:c1ab", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", 
"tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", 
"rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off 
[fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.148", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:8f:92:e7:c1:ab", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.148"], "ansible_all_ipv6_addresses": ["fe80::108f:92ff:fee7:c1ab"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.148", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::108f:92ff:fee7:c1ab"]}, "ansible_service_mgr": "systemd", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 12180 1727204058.49597: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
<<< 12180 1727204058.49657: stderr chunk (state=3): >>><<< 12180 1727204058.49662: stdout chunk (state=3): >>><<< 12180 1727204058.49882: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node1", "ansible_hostname": "managed-node1", "ansible_nodename": "managed-node1", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "622812727ab94fd6acd7dd0d437b6e90", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_is_chroot": false, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAL33r0sK53nK1ELEWEygFfBly+jKL3G1irB+e4OjfP+034giVSb4+qmZbsccYzFRUysDiwQ9AOkXxjXzuDmR+xqyvjg1wiGR1mtnhVEjD5QOMP3FrsA4T0YUj+99RePF5V1syZcivhL83fhGMQW2xqX2DsatToaaogZ2OB3PfjtjAAAAFQDxVD/D0jmbOX2y1lkpNUepQHopQwAAAIEAmIlCnXBlcPjUWk7TsM1FtBhlH1jHHCOTF1EkipNNB3yizhCo4XzHdZ42Etc3A12/rcZ94rFaauqCV6XrShBkQ2YBIcz9u8BOyWI/nScoq9IA/qLrhWWtjBWDyrdKnEa5YZssQtDa+FaZQkzy1TQpvFQxv5c95+TrmPFgDpw+0q0AAACBAKYOTFtEPTGzq9w6YdMspRWk65ZgXou58bQl818PvNnuZKKVReDFknfNCcabfj+HjlOg9wBCZZ+D3vopxZ4Qgevz/pLqcnLY7Kxx+xf6NhqDwcEwkHk/VYomBLrfyEZP8N81dcv36ZZUVoca5Y+2ZG2o1gC632nLGosyJBtmPmel", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCzhflzK5hY1zLI6wcdpu95QnCG0TwdK/8SyDudYYHDoRycyI9pVuSUQsXpQq3jHdjCurFgtKtyr8lvo1dWvKJ9SZpQk4asejncDNODyhSR95eNQg6E1G2kN1mscOp76cjW9Muvyhcku112WdRWTEojLJL5DfJAWrsWwHJI+QuhZuKvrlsxPvfOuY5td/aGC/Ydzbjkmya2qvXXJRscQArDnYmiPiatkFESRif9MXdmIn2LqQXAcZGFUG+SWQvZR1PDWKI2U5HxvoeUf+Uh2jDO3mFDWao9+SGRC2QuO+xLJgoiKIx2L3GWLTkbKjAbsk0iedaUuh+GdmUatsU09UVZi9IYBJYjhiYuZKsYx2LNpBqu8fxh5RaBfadQzkDGVBJE45/9X+9vlSygk3zMak9yWtS9vfV+CoODJx9wA1tv3r0Veiy/Y9bbcT7DtQhiWscP2X/cF2QZtdabW+Rb+zKZomn+6upN+zZeyVRClRsqVNURxevMs+UyJTKV481ayMU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBHpEZiCiKJZKK5PvXzPGl0kyJcU4P7nxoUjBffLcHt9dAB0RhjGORZ4v3/W6TdO0PAsLaKZ7WyFecLN3V9VWyiA=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIK5vZWfq5/76ny3vCPOJqG/mpsIiiNwZzQWhA7bM1PFT", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "54", "second": "18", "epoch": "1727204058", "epoch_int": "1727204058", "date": "2024-09-24", "time": "14:54:18", "iso8601_micro": "2024-09-24T18:54:18.175871Z", "iso8601": "2024-09-24T18:54:18Z", "iso8601_basic": "20240924T145418175871", "iso8601_basic_short": "20240924T145418", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_lsb": {}, "ansible_local": {}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 
2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2802, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 730, "free": 2802}, "nocache": {"free": 3265, "used": 267}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec28c5e6-50d6-5684-e735-f75357a23b08", "ansible_product_uuid": "ec28c5e6-50d6-5684-e735-f75357a23b08", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 321, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 
268367278080, "size_available": 264272871424, "block_size": 4096, "block_total": 65519355, "block_available": 64519744, "block_used": 999611, "inode_total": 131071472, "inode_available": 130998250, "inode_used": 73222, "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae"}], "ansible_fips": false, "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 42862 10.31.9.148 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 42862 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5aef1ea-3141-48ae-bf33-0c6b351dd422", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", 
"ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_fibre_channel_wwn": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_loadavg": {"1m": 0.39, "5m": 0.37, "15m": 0.18}, "ansible_iscsi_iqn": "", "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "12:8f:92:e7:c1:ab", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.148", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::108f:92ff:fee7:c1ab", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", 
"vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", 
"tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", 
"hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.148", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:8f:92:e7:c1:ab", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.148"], "ansible_all_ipv6_addresses": ["fe80::108f:92ff:fee7:c1ab"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.148", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::108f:92ff:fee7:c1ab"]}, "ansible_service_mgr": "systemd", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: 
fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 12180 1727204058.50112: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204057.646542-12631-278463769028223/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12180 1727204058.50141: _low_level_execute_command(): starting 12180 1727204058.50152: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204057.646542-12631-278463769028223/ > /dev/null 2>&1 && sleep 0' 12180 1727204058.51020: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204058.51070: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204058.51087: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204058.51104: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204058.51154: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204058.51168: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204058.51183: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204058.51201: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204058.51222: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204058.51249: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204058.51270: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204058.51288: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204058.51305: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204058.51321: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204058.51336: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204058.51353: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204058.51440: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204058.51464: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204058.51488: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204058.51591: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204058.54190: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204058.54195: stdout chunk (state=3): >>><<< 12180 1727204058.54198: stderr chunk (state=3): >>><<< 12180 1727204058.54477: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204058.54484: handler run complete 12180 1727204058.54487: variable 'ansible_facts' from source: unknown 12180 1727204058.54525: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204058.54877: variable 'ansible_facts' from source: unknown 12180 1727204058.55108: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204058.55388: attempt loop complete, returning result 12180 1727204058.55467: _execute() done 12180 1727204058.55481: dumping result to json 12180 1727204058.55519: done dumping result, returning 12180 1727204058.55577: done running TaskExecutor() for managed-node1/TASK: Gathering Facts [0affcd87-79f5-ccb1-55ae-000000000129] 12180 1727204058.55587: sending task result for task 0affcd87-79f5-ccb1-55ae-000000000129 ok: [managed-node1] 12180 1727204058.56549: no more pending results, returning what we have 12180 1727204058.56554: results queue empty 12180 1727204058.56555: checking for any_errors_fatal 12180 1727204058.56556: done checking for any_errors_fatal 
12180 1727204058.56557: checking for max_fail_percentage 12180 1727204058.56558: done checking for max_fail_percentage 12180 1727204058.56559: checking to see if all hosts have failed and the running result is not ok 12180 1727204058.56560: done checking to see if all hosts have failed 12180 1727204058.56561: getting the remaining hosts for this loop 12180 1727204058.56563: done getting the remaining hosts for this loop 12180 1727204058.56568: getting the next task for host managed-node1 12180 1727204058.56576: done getting next task for host managed-node1 12180 1727204058.56578: ^ task is: TASK: meta (flush_handlers) 12180 1727204058.56580: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12180 1727204058.56583: getting variables 12180 1727204058.56585: in VariableManager get_vars() 12180 1727204058.56622: Calling all_inventory to load vars for managed-node1 12180 1727204058.56625: Calling groups_inventory to load vars for managed-node1 12180 1727204058.56630: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204058.56642: Calling all_plugins_play to load vars for managed-node1 12180 1727204058.56645: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204058.56649: Calling groups_plugins_play to load vars for managed-node1 12180 1727204058.56812: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204058.57121: done with get_vars() 12180 1727204058.57136: done getting variables 12180 1727204058.57351: done sending task result for task 0affcd87-79f5-ccb1-55ae-000000000129 12180 1727204058.57355: WORKER PROCESS EXITING 12180 1727204058.57405: in VariableManager get_vars() 12180 1727204058.57422: Calling 
all_inventory to load vars for managed-node1 12180 1727204058.57425: Calling groups_inventory to load vars for managed-node1 12180 1727204058.57429: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204058.57434: Calling all_plugins_play to load vars for managed-node1 12180 1727204058.57436: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204058.57443: Calling groups_plugins_play to load vars for managed-node1 12180 1727204058.57657: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204058.58031: done with get_vars() 12180 1727204058.58045: done queuing things up, now waiting for results queue to drain 12180 1727204058.58046: results queue empty 12180 1727204058.58047: checking for any_errors_fatal 12180 1727204058.58050: done checking for any_errors_fatal 12180 1727204058.58051: checking for max_fail_percentage 12180 1727204058.58052: done checking for max_fail_percentage 12180 1727204058.58053: checking to see if all hosts have failed and the running result is not ok 12180 1727204058.58053: done checking to see if all hosts have failed 12180 1727204058.58054: getting the remaining hosts for this loop 12180 1727204058.58055: done getting the remaining hosts for this loop 12180 1727204058.58058: getting the next task for host managed-node1 12180 1727204058.58068: done getting next task for host managed-node1 12180 1727204058.58071: ^ task is: TASK: INIT Prepare setup 12180 1727204058.58073: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12180 1727204058.58076: getting variables 12180 1727204058.58077: in VariableManager get_vars() 12180 1727204058.58092: Calling all_inventory to load vars for managed-node1 12180 1727204058.58094: Calling groups_inventory to load vars for managed-node1 12180 1727204058.58096: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204058.58108: Calling all_plugins_play to load vars for managed-node1 12180 1727204058.58111: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204058.58114: Calling groups_plugins_play to load vars for managed-node1 12180 1727204058.58259: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204058.58474: done with get_vars() 12180 1727204058.58482: done getting variables 12180 1727204058.58561: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [INIT Prepare setup] ****************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml:15 Tuesday 24 September 2024 14:54:18 -0400 (0:00:01.020) 0:00:05.997 ***** 12180 1727204058.58588: entering _queue_task() for managed-node1/debug 12180 1727204058.58590: Creating lock for debug 12180 1727204058.59171: worker is 1 (out of 1 available) 12180 1727204058.59182: exiting _queue_task() for managed-node1/debug 12180 1727204058.59198: done queuing things up, now waiting for results queue to drain 12180 1727204058.59200: waiting for pending results... 
12180 1727204058.59463: running TaskExecutor() for managed-node1/TASK: INIT Prepare setup 12180 1727204058.59581: in run() - task 0affcd87-79f5-ccb1-55ae-00000000000b 12180 1727204058.59607: variable 'ansible_search_path' from source: unknown 12180 1727204058.59699: calling self._execute() 12180 1727204058.59870: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204058.59887: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204058.59904: variable 'omit' from source: magic vars 12180 1727204058.60357: variable 'ansible_distribution_major_version' from source: facts 12180 1727204058.60387: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204058.60415: variable 'omit' from source: magic vars 12180 1727204058.60447: variable 'omit' from source: magic vars 12180 1727204058.60493: variable 'omit' from source: magic vars 12180 1727204058.60553: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12180 1727204058.60622: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12180 1727204058.60660: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12180 1727204058.60694: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204058.60719: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204058.60774: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12180 1727204058.60785: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204058.60794: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204058.60999: Set connection var ansible_pipelining to False 12180 1727204058.61016: Set 
connection var ansible_shell_type to sh 12180 1727204058.61038: Set connection var ansible_timeout to 10 12180 1727204058.61069: Set connection var ansible_connection to ssh 12180 1727204058.61089: Set connection var ansible_shell_executable to /bin/sh 12180 1727204058.61108: Set connection var ansible_module_compression to ZIP_DEFLATED 12180 1727204058.61157: variable 'ansible_shell_executable' from source: unknown 12180 1727204058.61174: variable 'ansible_connection' from source: unknown 12180 1727204058.61184: variable 'ansible_module_compression' from source: unknown 12180 1727204058.61191: variable 'ansible_shell_type' from source: unknown 12180 1727204058.61197: variable 'ansible_shell_executable' from source: unknown 12180 1727204058.61204: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204058.61213: variable 'ansible_pipelining' from source: unknown 12180 1727204058.61219: variable 'ansible_timeout' from source: unknown 12180 1727204058.61229: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204058.61404: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12180 1727204058.61431: variable 'omit' from source: magic vars 12180 1727204058.61452: starting attempt loop 12180 1727204058.61468: running the handler 12180 1727204058.61545: handler run complete 12180 1727204058.61581: attempt loop complete, returning result 12180 1727204058.61588: _execute() done 12180 1727204058.61595: dumping result to json 12180 1727204058.61604: done dumping result, returning 12180 1727204058.61620: done running TaskExecutor() for managed-node1/TASK: INIT Prepare setup [0affcd87-79f5-ccb1-55ae-00000000000b] 12180 1727204058.61633: sending task result for task 
0affcd87-79f5-ccb1-55ae-00000000000b ok: [managed-node1] => {} MSG: ################################################## 12180 1727204058.61790: no more pending results, returning what we have 12180 1727204058.61794: results queue empty 12180 1727204058.61795: checking for any_errors_fatal 12180 1727204058.61797: done checking for any_errors_fatal 12180 1727204058.61798: checking for max_fail_percentage 12180 1727204058.61800: done checking for max_fail_percentage 12180 1727204058.61801: checking to see if all hosts have failed and the running result is not ok 12180 1727204058.61802: done checking to see if all hosts have failed 12180 1727204058.61803: getting the remaining hosts for this loop 12180 1727204058.61805: done getting the remaining hosts for this loop 12180 1727204058.61808: getting the next task for host managed-node1 12180 1727204058.61816: done getting next task for host managed-node1 12180 1727204058.61819: ^ task is: TASK: Install dnsmasq 12180 1727204058.61822: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12180 1727204058.61829: getting variables 12180 1727204058.61832: in VariableManager get_vars() 12180 1727204058.61886: Calling all_inventory to load vars for managed-node1 12180 1727204058.61889: Calling groups_inventory to load vars for managed-node1 12180 1727204058.61891: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204058.61903: Calling all_plugins_play to load vars for managed-node1 12180 1727204058.61905: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204058.61908: Calling groups_plugins_play to load vars for managed-node1 12180 1727204058.62863: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204058.63519: done with get_vars() 12180 1727204058.63534: done getting variables 12180 1727204058.63924: done sending task result for task 0affcd87-79f5-ccb1-55ae-00000000000b 12180 1727204058.63927: WORKER PROCESS EXITING 12180 1727204058.63959: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Install dnsmasq] ********************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:3 Tuesday 24 September 2024 14:54:18 -0400 (0:00:00.054) 0:00:06.052 ***** 12180 1727204058.64011: entering _queue_task() for managed-node1/package 12180 1727204058.65093: worker is 1 (out of 1 available) 12180 1727204058.65105: exiting _queue_task() for managed-node1/package 12180 1727204058.65118: done queuing things up, now waiting for results queue to drain 12180 1727204058.65120: waiting for pending results... 
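The task banner above prints two clocks, `(0:00:00.054) 0:00:06.052`: the elapsed time of the task that just finished, then the cumulative playbook runtime, in the style of a timing callback such as `profile_tasks`. A minimal sketch of that formatting (the helper name is mine, not Ansible's):

```python
def fmt_elapsed(seconds: float) -> str:
    """Render seconds as H:MM:SS.mmm, matching the task banner format."""
    hours, rem = divmod(seconds, 3600)
    minutes, secs = divmod(rem, 60)
    return "%d:%02d:%06.3f" % (hours, minutes, secs)

# The banner pairs the two values: fmt_elapsed(0.054) for the previous
# task's duration, fmt_elapsed(6.052) for total runtime so far.
```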
12180 1727204058.65600: running TaskExecutor() for managed-node1/TASK: Install dnsmasq 12180 1727204058.65789: in run() - task 0affcd87-79f5-ccb1-55ae-00000000000f 12180 1727204058.65858: variable 'ansible_search_path' from source: unknown 12180 1727204058.65938: variable 'ansible_search_path' from source: unknown 12180 1727204058.65990: calling self._execute() 12180 1727204058.66196: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204058.66207: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204058.66221: variable 'omit' from source: magic vars 12180 1727204058.68985: variable 'ansible_distribution_major_version' from source: facts 12180 1727204058.69011: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204058.69023: variable 'omit' from source: magic vars 12180 1727204058.69199: variable 'omit' from source: magic vars 12180 1727204058.69596: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12180 1727204058.78736: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12180 1727204058.78901: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12180 1727204058.79114: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12180 1727204058.79163: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12180 1727204058.79276: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12180 1727204058.79489: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12180 1727204058.79520: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12180 1727204058.79549: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12180 1727204058.79618: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12180 1727204058.79700: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12180 1727204058.80037: variable '__network_is_ostree' from source: set_fact 12180 1727204058.80049: variable 'omit' from source: magic vars 12180 1727204058.80089: variable 'omit' from source: magic vars 12180 1727204058.80152: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12180 1727204058.80259: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12180 1727204058.80286: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12180 1727204058.80358: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204058.80378: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204058.80480: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12180 1727204058.80490: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204058.80497: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed-node1' 12180 1727204058.80720: Set connection var ansible_pipelining to False 12180 1727204058.80732: Set connection var ansible_shell_type to sh 12180 1727204058.80743: Set connection var ansible_timeout to 10 12180 1727204058.80783: Set connection var ansible_connection to ssh 12180 1727204058.80794: Set connection var ansible_shell_executable to /bin/sh 12180 1727204058.80893: Set connection var ansible_module_compression to ZIP_DEFLATED 12180 1727204058.80925: variable 'ansible_shell_executable' from source: unknown 12180 1727204058.80935: variable 'ansible_connection' from source: unknown 12180 1727204058.80942: variable 'ansible_module_compression' from source: unknown 12180 1727204058.80948: variable 'ansible_shell_type' from source: unknown 12180 1727204058.80953: variable 'ansible_shell_executable' from source: unknown 12180 1727204058.80958: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204058.80967: variable 'ansible_pipelining' from source: unknown 12180 1727204058.80973: variable 'ansible_timeout' from source: unknown 12180 1727204058.80980: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204058.81094: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12180 1727204058.81343: variable 'omit' from source: magic vars 12180 1727204058.81353: starting attempt loop 12180 1727204058.81359: running the handler 12180 1727204058.81372: variable 'ansible_facts' from source: unknown 12180 1727204058.81379: variable 'ansible_facts' from source: unknown 12180 1727204058.81542: _low_level_execute_command(): starting 12180 1727204058.81560: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12180 
1727204058.83640: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204058.83659: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204058.83677: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204058.83697: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204058.83753: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204058.83767: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204058.83782: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204058.83798: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204058.83834: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204058.83846: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204058.83863: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204058.83880: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204058.83898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204058.83911: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204058.83941: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204058.83955: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204058.84042: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204058.84173: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204058.84196: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204058.84326: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204058.86532: stdout chunk (state=3): >>>/root <<< 12180 1727204058.86778: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204058.86782: stdout chunk (state=3): >>><<< 12180 1727204058.86784: stderr chunk (state=3): >>><<< 12180 1727204058.86896: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204058.86900: _low_level_execute_command(): starting 12180 1727204058.86905: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204058.8680298-12790-110324518470110 `" && echo 
ansible-tmp-1727204058.8680298-12790-110324518470110="` echo /root/.ansible/tmp/ansible-tmp-1727204058.8680298-12790-110324518470110 `" ) && sleep 0' 12180 1727204058.89399: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204058.89418: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204058.89437: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204058.89456: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204058.89515: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204058.89532: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204058.89549: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204058.89566: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204058.89578: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204058.89589: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204058.89600: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204058.89619: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204058.89638: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204058.89653: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204058.89669: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204058.89683: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204058.89763: stderr chunk (state=3): >>>debug1: 
auto-mux: Trying existing master <<< 12180 1727204058.89808: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204058.89825: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204058.89923: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204058.91798: stdout chunk (state=3): >>>ansible-tmp-1727204058.8680298-12790-110324518470110=/root/.ansible/tmp/ansible-tmp-1727204058.8680298-12790-110324518470110 <<< 12180 1727204058.92012: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204058.92016: stdout chunk (state=3): >>><<< 12180 1727204058.92018: stderr chunk (state=3): >>><<< 12180 1727204058.92274: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204058.8680298-12790-110324518470110=/root/.ansible/tmp/ansible-tmp-1727204058.8680298-12790-110324518470110 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204058.92278: variable 'ansible_module_compression' from source: unknown 12180 1727204058.92282: ANSIBALLZ: Using generic lock for ansible.legacy.dnf 12180 1727204058.92285: ANSIBALLZ: Acquiring lock 12180 1727204058.92287: ANSIBALLZ: Lock acquired: 140650305861680 12180 1727204058.92289: ANSIBALLZ: Creating module 12180 1727204059.30501: ANSIBALLZ: Writing module into payload 12180 1727204059.31104: ANSIBALLZ: Writing module 12180 1727204059.31181: ANSIBALLZ: Renaming module 12180 1727204059.31237: ANSIBALLZ: Done creating module 12180 1727204059.31268: variable 'ansible_facts' from source: unknown 12180 1727204059.31398: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204058.8680298-12790-110324518470110/AnsiballZ_dnf.py 12180 1727204059.31726: Sending initial data 12180 1727204059.31734: Sent initial data (152 bytes) 12180 1727204059.33254: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204059.33258: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204059.33295: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 12180 1727204059.33299: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204059.33302: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.148 originally 10.31.9.148 debug2: match found <<< 12180 1727204059.33310: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204059.33380: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204059.33398: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204059.33491: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204059.36051: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12180 1727204059.36106: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12180 1727204059.36149: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12180cbnqllfr/tmphkwrpc6n /root/.ansible/tmp/ansible-tmp-1727204058.8680298-12790-110324518470110/AnsiballZ_dnf.py <<< 12180 1727204059.36199: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12180 1727204059.37875: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204059.38329: stderr chunk (state=3): >>><<< 12180 1727204059.38351: stdout chunk (state=3): >>><<< 12180 1727204059.38370: done transferring module to remote 12180 1727204059.38381: _low_level_execute_command(): 
starting 12180 1727204059.38407: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204058.8680298-12790-110324518470110/ /root/.ansible/tmp/ansible-tmp-1727204058.8680298-12790-110324518470110/AnsiballZ_dnf.py && sleep 0' 12180 1727204059.40177: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204059.40221: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204059.40244: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204059.40283: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204059.40722: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204059.40754: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204059.40777: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204059.40791: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204059.40849: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204059.40882: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204059.40886: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204059.40888: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204059.40972: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204059.40977: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204059.40995: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204059.41000: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204059.41210: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204059.41258: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204059.41279: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204059.41470: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204059.44284: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204059.44323: stderr chunk (state=3): >>><<< 12180 1727204059.44329: stdout chunk (state=3): >>><<< 12180 1727204059.44344: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204059.44347: _low_level_execute_command(): starting 12180 1727204059.44351: 
_low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204058.8680298-12790-110324518470110/AnsiballZ_dnf.py && sleep 0' 12180 1727204059.45140: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204059.45149: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204059.45159: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204059.45176: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204059.45243: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204059.45250: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204059.45260: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204059.45282: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204059.45289: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204059.45298: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204059.45312: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204059.45334: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204059.45346: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204059.45354: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204059.45361: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204059.45373: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204059.45496: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204059.45514: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204059.45829: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204059.45937: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204062.14852: stdout chunk (state=3): >>> {"msg": "", "changed": true, "results": ["Installed: dnsmasq-2.85-16.el9.x86_64"], "rc": 0, "invocation": {"module_args": {"name": ["dnsmasq"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 12180 1727204062.20262: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
<<< 12180 1727204062.20269: stdout chunk (state=3): >>><<< 12180 1727204062.20274: stderr chunk (state=3): >>><<< 12180 1727204062.20296: _low_level_execute_command() done: rc=0, stdout= {"msg": "", "changed": true, "results": ["Installed: dnsmasq-2.85-16.el9.x86_64"], "rc": 0, "invocation": {"module_args": {"name": ["dnsmasq"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 12180 1727204062.20338: done with _execute_module (ansible.legacy.dnf, {'name': 'dnsmasq', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204058.8680298-12790-110324518470110/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12180 1727204062.20346: _low_level_execute_command(): starting 12180 1727204062.20351: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204058.8680298-12790-110324518470110/ > /dev/null 2>&1 && sleep 0' 12180 1727204062.22175: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204062.22186: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204062.22195: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204062.22210: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204062.22255: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204062.22679: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204062.22689: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204062.22703: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204062.22711: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204062.22718: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204062.22726: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204062.22739: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204062.22751: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204062.22759: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204062.22768: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204062.22777: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204062.22854: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204062.22877: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204062.22890: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204062.22979: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204062.24912: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204062.24916: stdout chunk (state=3): >>><<< 12180 1727204062.24919: stderr chunk (state=3): >>><<< 12180 1727204062.24942: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204062.24946: handler run complete 12180 1727204062.25110: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12180 1727204062.25293: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12180 1727204062.25330: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12180 1727204062.25363: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12180 1727204062.25393: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12180 1727204062.25470: variable '__install_status' from source: unknown 12180 1727204062.25486: Evaluated conditional (__install_status is success): True 12180 1727204062.25502: attempt loop complete, returning result 12180 1727204062.25505: _execute() done 12180 1727204062.25507: dumping result to json 12180 1727204062.25514: done dumping result, returning 12180 1727204062.25522: done running TaskExecutor() for managed-node1/TASK: Install 
dnsmasq [0affcd87-79f5-ccb1-55ae-00000000000f] 12180 1727204062.25529: sending task result for task 0affcd87-79f5-ccb1-55ae-00000000000f changed: [managed-node1] => { "attempts": 1, "changed": true, "rc": 0, "results": [ "Installed: dnsmasq-2.85-16.el9.x86_64" ] } 12180 1727204062.25733: no more pending results, returning what we have 12180 1727204062.25737: results queue empty 12180 1727204062.25738: checking for any_errors_fatal 12180 1727204062.25745: done checking for any_errors_fatal 12180 1727204062.25746: checking for max_fail_percentage 12180 1727204062.25749: done checking for max_fail_percentage 12180 1727204062.25750: checking to see if all hosts have failed and the running result is not ok 12180 1727204062.25752: done checking to see if all hosts have failed 12180 1727204062.25752: getting the remaining hosts for this loop 12180 1727204062.25754: done getting the remaining hosts for this loop 12180 1727204062.25758: getting the next task for host managed-node1 12180 1727204062.25767: done getting next task for host managed-node1 12180 1727204062.25770: ^ task is: TASK: Install pgrep, sysctl 12180 1727204062.25773: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12180 1727204062.25776: getting variables 12180 1727204062.25778: in VariableManager get_vars() 12180 1727204062.25819: Calling all_inventory to load vars for managed-node1 12180 1727204062.25822: Calling groups_inventory to load vars for managed-node1 12180 1727204062.25824: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204062.25834: Calling all_plugins_play to load vars for managed-node1 12180 1727204062.25836: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204062.25838: Calling groups_plugins_play to load vars for managed-node1 12180 1727204062.25977: done sending task result for task 0affcd87-79f5-ccb1-55ae-00000000000f 12180 1727204062.25980: WORKER PROCESS EXITING 12180 1727204062.26000: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204062.26203: done with get_vars() 12180 1727204062.26216: done getting variables 12180 1727204062.26278: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Install pgrep, sysctl] *************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:17 Tuesday 24 September 2024 14:54:22 -0400 (0:00:03.624) 0:00:09.676 ***** 12180 1727204062.26425: entering _queue_task() for managed-node1/package 12180 1727204062.26831: worker is 1 (out of 1 available) 12180 1727204062.26956: exiting _queue_task() for managed-node1/package 12180 1727204062.26970: done queuing things up, now waiting for results queue to drain 12180 1727204062.26972: waiting for pending results... 
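Throughout this log, every remote action goes through `_low_level_execute_command()`, which wraps the command in `/bin/sh -c '<cmd> && sleep 0'` and streams back stdout/stderr chunks over the multiplexed SSH connection. The following is a minimal *local* sketch of that wrapping pattern only (the function name is illustrative; real Ansible executes this over SSH, typically through an existing ControlMaster session, as the `auto-mux: Trying existing master` lines show):

```python
import subprocess

def low_level_execute_command(cmd: str) -> tuple[int, str, str]:
    """Local sketch of the _low_level_execute_command() pattern in the log:
    the command is wrapped in /bin/sh -c '<cmd> && sleep 0'. The trailing
    `sleep 0` gives the shell a final scheduling point so buffered output
    is flushed before exit. Returns (rc, stdout, stderr), mirroring the
    'done: rc=0, stdout=..., stderr=...' summary lines above."""
    proc = subprocess.run(
        ["/bin/sh", "-c", f"{cmd} && sleep 0"],
        capture_output=True,
        text=True,
    )
    return proc.returncode, proc.stdout, proc.stderr

# Same shape as the log's `echo ~ && sleep 0` probe for the remote home dir
rc, out, err = low_level_execute_command("echo ~")
```

This is the same shape used later in the log for temp-dir creation (`umask 77 && mkdir -p ...`), module transfer, `chmod u+x`, execution of `AnsiballZ_dnf.py`, and cleanup (`rm -f -r ... > /dev/null 2>&1`).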
12180 1727204062.27907: running TaskExecutor() for managed-node1/TASK: Install pgrep, sysctl 12180 1727204062.28015: in run() - task 0affcd87-79f5-ccb1-55ae-000000000010 12180 1727204062.28027: variable 'ansible_search_path' from source: unknown 12180 1727204062.28030: variable 'ansible_search_path' from source: unknown 12180 1727204062.28068: calling self._execute() 12180 1727204062.29558: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204062.29564: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204062.29571: variable 'omit' from source: magic vars 12180 1727204062.29927: variable 'ansible_distribution_major_version' from source: facts 12180 1727204062.29943: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204062.30675: variable 'ansible_os_family' from source: facts 12180 1727204062.30681: Evaluated conditional (ansible_os_family == 'RedHat'): True 12180 1727204062.32199: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12180 1727204062.32872: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12180 1727204062.32920: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12180 1727204062.32954: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12180 1727204062.33391: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12180 1727204062.33487: variable 'ansible_distribution_major_version' from source: facts 12180 1727204062.33785: Evaluated conditional (ansible_distribution_major_version is version('6', '<=')): False 12180 1727204062.33793: when evaluation is False, skipping this task 12180 1727204062.33799: _execute() done 12180 1727204062.33805: dumping result to json 12180 1727204062.33811: done dumping result, 
returning 12180 1727204062.33822: done running TaskExecutor() for managed-node1/TASK: Install pgrep, sysctl [0affcd87-79f5-ccb1-55ae-000000000010] 12180 1727204062.33834: sending task result for task 0affcd87-79f5-ccb1-55ae-000000000010 skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution_major_version is version('6', '<=')", "skip_reason": "Conditional result was False" } 12180 1727204062.33988: no more pending results, returning what we have 12180 1727204062.33992: results queue empty 12180 1727204062.33993: checking for any_errors_fatal 12180 1727204062.33999: done checking for any_errors_fatal 12180 1727204062.34000: checking for max_fail_percentage 12180 1727204062.34001: done checking for max_fail_percentage 12180 1727204062.34002: checking to see if all hosts have failed and the running result is not ok 12180 1727204062.34003: done checking to see if all hosts have failed 12180 1727204062.34003: getting the remaining hosts for this loop 12180 1727204062.34005: done getting the remaining hosts for this loop 12180 1727204062.34010: getting the next task for host managed-node1 12180 1727204062.34016: done getting next task for host managed-node1 12180 1727204062.34019: ^ task is: TASK: Install pgrep, sysctl 12180 1727204062.34021: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12180 1727204062.34024: getting variables 12180 1727204062.34026: in VariableManager get_vars() 12180 1727204062.34070: Calling all_inventory to load vars for managed-node1 12180 1727204062.34073: Calling groups_inventory to load vars for managed-node1 12180 1727204062.34076: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204062.34087: Calling all_plugins_play to load vars for managed-node1 12180 1727204062.34089: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204062.34092: Calling groups_plugins_play to load vars for managed-node1 12180 1727204062.34450: done sending task result for task 0affcd87-79f5-ccb1-55ae-000000000010 12180 1727204062.34453: WORKER PROCESS EXITING 12180 1727204062.34879: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204062.35056: done with get_vars() 12180 1727204062.35068: done getting variables 12180 1727204062.35124: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Install pgrep, sysctl] *************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:26 Tuesday 24 September 2024 14:54:22 -0400 (0:00:00.087) 0:00:09.763 ***** 12180 1727204062.35157: entering _queue_task() for managed-node1/package 12180 1727204062.36729: worker is 1 (out of 1 available) 12180 1727204062.36742: exiting _queue_task() for managed-node1/package 12180 1727204062.36756: done queuing things up, now waiting for results queue to drain 12180 1727204062.36757: waiting for pending results... 
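The skip above comes from the Jinja2 `version` test: on this RHEL 9 node, `ansible_distribution_major_version is version('6', '<=')` evaluates to `'9' <= '6'` as versions, which is False, so the task is skipped; the sibling task gated on `version('7', '>=')` runs instead. A rough sketch of that comparison semantics (a hypothetical helper, not Ansible's actual implementation, which supports more operators and version schemes):

```python
import operator

def version_test(value, other, op):
    """Sketch of the Jinja `version` test seen in the log, e.g.
    (ansible_distribution_major_version is version('6', '<=')).
    Compares dotted version strings numerically, component by
    component, rather than lexicographically."""
    ops = {
        "<": operator.lt, "<=": operator.le,
        ">": operator.gt, ">=": operator.ge,
        "==": operator.eq, "!=": operator.ne,
    }
    as_tuple = lambda v: tuple(int(p) for p in str(v).split("."))
    return ops[op](as_tuple(value), as_tuple(other))

# RHEL 9 host: '9' <= '6' is False -> "skipping this task"
version_test("9", "6", "<=")
# ...while '9' >= '7' is True -> the next task proceeds
version_test("9", "7", ">=")
```

Numeric tuple comparison is the important part: a plain string comparison would rank `'10' < '9'`, which is exactly the bug this kind of test avoids.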
12180 1727204062.37753: running TaskExecutor() for managed-node1/TASK: Install pgrep, sysctl 12180 1727204062.38498: in run() - task 0affcd87-79f5-ccb1-55ae-000000000011 12180 1727204062.39031: variable 'ansible_search_path' from source: unknown 12180 1727204062.39041: variable 'ansible_search_path' from source: unknown 12180 1727204062.39084: calling self._execute() 12180 1727204062.39174: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204062.39187: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204062.39201: variable 'omit' from source: magic vars 12180 1727204062.39577: variable 'ansible_distribution_major_version' from source: facts 12180 1727204062.40121: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204062.40250: variable 'ansible_os_family' from source: facts 12180 1727204062.40262: Evaluated conditional (ansible_os_family == 'RedHat'): True 12180 1727204062.40442: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12180 1727204062.40916: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12180 1727204062.41514: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12180 1727204062.41557: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12180 1727204062.41596: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12180 1727204062.41692: variable 'ansible_distribution_major_version' from source: facts 12180 1727204062.41710: Evaluated conditional (ansible_distribution_major_version is version('7', '>=')): True 12180 1727204062.41720: variable 'omit' from source: magic vars 12180 1727204062.41777: variable 'omit' from source: magic vars 12180 1727204062.42345: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12180 1727204062.48025: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12180 1727204062.48149: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12180 1727204062.48195: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12180 1727204062.48348: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12180 1727204062.48382: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12180 1727204062.48485: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12180 1727204062.48524: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12180 1727204062.48560: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12180 1727204062.48610: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12180 1727204062.48636: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12180 1727204062.48750: variable '__network_is_ostree' from source: set_fact 12180 1727204062.48761: 
variable 'omit' from source: magic vars 12180 1727204062.48799: variable 'omit' from source: magic vars 12180 1727204062.48838: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12180 1727204062.48873: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12180 1727204062.48900: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12180 1727204062.48966: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204062.48982: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204062.49061: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12180 1727204062.49073: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204062.49081: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204062.49195: Set connection var ansible_pipelining to False 12180 1727204062.49204: Set connection var ansible_shell_type to sh 12180 1727204062.49217: Set connection var ansible_timeout to 10 12180 1727204062.49226: Set connection var ansible_connection to ssh 12180 1727204062.49241: Set connection var ansible_shell_executable to /bin/sh 12180 1727204062.49253: Set connection var ansible_module_compression to ZIP_DEFLATED 12180 1727204062.49293: variable 'ansible_shell_executable' from source: unknown 12180 1727204062.49301: variable 'ansible_connection' from source: unknown 12180 1727204062.49308: variable 'ansible_module_compression' from source: unknown 12180 1727204062.49315: variable 'ansible_shell_type' from source: unknown 12180 1727204062.49321: variable 'ansible_shell_executable' from source: unknown 12180 1727204062.49330: variable 'ansible_host' from source: host vars for 'managed-node1' 
12180 1727204062.49339: variable 'ansible_pipelining' from source: unknown 12180 1727204062.49345: variable 'ansible_timeout' from source: unknown 12180 1727204062.49352: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204062.49461: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12180 1727204062.49481: variable 'omit' from source: magic vars 12180 1727204062.49496: starting attempt loop 12180 1727204062.49505: running the handler 12180 1727204062.49515: variable 'ansible_facts' from source: unknown 12180 1727204062.49521: variable 'ansible_facts' from source: unknown 12180 1727204062.49562: _low_level_execute_command(): starting 12180 1727204062.49608: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12180 1727204062.50637: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204062.50656: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204062.50673: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204062.50696: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204062.50743: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204062.50756: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204062.50774: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204062.50792: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204062.50817: stderr chunk (state=3): >>>debug2: 
resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204062.50831: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204062.50843: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204062.50858: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204062.50879: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204062.50891: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204062.50903: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204062.50921: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204062.51067: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204062.51159: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204062.51180: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204062.51272: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204062.52848: stdout chunk (state=3): >>>/root <<< 12180 1727204062.53044: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204062.53048: stdout chunk (state=3): >>><<< 12180 1727204062.53051: stderr chunk (state=3): >>><<< 12180 1727204062.53170: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204062.53174: _low_level_execute_command(): starting 12180 1727204062.53178: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204062.5307686-13140-139860896197429 `" && echo ansible-tmp-1727204062.5307686-13140-139860896197429="` echo /root/.ansible/tmp/ansible-tmp-1727204062.5307686-13140-139860896197429 `" ) && sleep 0' 12180 1727204062.54868: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204062.54888: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204062.54908: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204062.54930: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204062.54978: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204062.54992: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204062.55006: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204062.55031: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204062.55044: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204062.55056: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204062.55071: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204062.55085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204062.55101: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204062.55114: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204062.55131: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204062.55146: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204062.55224: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204062.55370: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204062.55388: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204062.56385: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204062.57334: stdout chunk (state=3): >>>ansible-tmp-1727204062.5307686-13140-139860896197429=/root/.ansible/tmp/ansible-tmp-1727204062.5307686-13140-139860896197429 <<< 12180 1727204062.57539: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204062.57543: stdout chunk (state=3): >>><<< 12180 1727204062.57546: stderr chunk (state=3): >>><<< 12180 1727204062.57570: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204062.5307686-13140-139860896197429=/root/.ansible/tmp/ansible-tmp-1727204062.5307686-13140-139860896197429 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204062.57672: variable 'ansible_module_compression' from source: unknown 12180 1727204062.57770: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12180cbnqllfr/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 12180 1727204062.57773: variable 'ansible_facts' from source: unknown 12180 1727204062.57832: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204062.5307686-13140-139860896197429/AnsiballZ_dnf.py 12180 1727204062.58606: Sending initial data 12180 1727204062.58610: Sent initial data (152 bytes) 12180 1727204062.61111: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204062.61177: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204062.61187: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204062.61202: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204062.61311: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204062.61318: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204062.61330: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204062.61346: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204062.61353: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204062.61360: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204062.61370: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204062.61381: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204062.61396: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204062.61403: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204062.61410: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204062.61419: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204062.61496: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204062.61624: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204062.61631: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204062.61715: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204062.63473: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server 
supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12180 1727204062.63525: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12180 1727204062.63582: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12180cbnqllfr/tmpzri3le1j /root/.ansible/tmp/ansible-tmp-1727204062.5307686-13140-139860896197429/AnsiballZ_dnf.py <<< 12180 1727204062.63630: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12180 1727204062.65597: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204062.65737: stderr chunk (state=3): >>><<< 12180 1727204062.65741: stdout chunk (state=3): >>><<< 12180 1727204062.65743: done transferring module to remote 12180 1727204062.65749: _low_level_execute_command(): starting 12180 1727204062.65751: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204062.5307686-13140-139860896197429/ /root/.ansible/tmp/ansible-tmp-1727204062.5307686-13140-139860896197429/AnsiballZ_dnf.py && sleep 0' 12180 1727204062.67399: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204062.67405: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204062.67411: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204062.67429: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
<<< 12180 1727204062.67470: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204062.67476: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204062.67492: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204062.67506: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204062.67855: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204062.67863: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204062.67867: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204062.67874: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204062.67877: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204062.67879: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204062.67880: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204062.67882: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204062.67884: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204062.67885: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204062.67887: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204062.68073: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204062.69788: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204062.69859: stderr chunk (state=3): >>><<< 12180 1727204062.69863: stdout chunk (state=3): >>><<< 12180 1727204062.69966: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204062.69971: _low_level_execute_command(): starting 12180 1727204062.69974: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204062.5307686-13140-139860896197429/AnsiballZ_dnf.py && sleep 0' 12180 1727204062.71780: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204062.71985: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204062.71999: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204062.72018: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204062.72067: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 
originally 10.31.9.148 <<< 12180 1727204062.72147: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204062.72166: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204062.72192: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204062.72205: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204062.72216: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204062.72230: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204062.72248: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204062.72268: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204062.72281: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204062.72292: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204062.72304: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204062.72615: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204062.72634: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204062.72648: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204062.72748: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204063.63976: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["procps-ng"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], 
"disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 12180 1727204063.68269: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. <<< 12180 1727204063.68274: stdout chunk (state=3): >>><<< 12180 1727204063.68276: stderr chunk (state=3): >>><<< 12180 1727204063.68431: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["procps-ng"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 12180 1727204063.68435: done with _execute_module (ansible.legacy.dnf, {'name': 'procps-ng', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204062.5307686-13140-139860896197429/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12180 1727204063.68438: _low_level_execute_command(): starting 12180 1727204063.68440: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204062.5307686-13140-139860896197429/ > /dev/null 2>&1 && sleep 0' 12180 1727204063.73055: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204063.73183: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 12180 1727204063.73194: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204063.73257: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204063.73523: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204063.73541: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204063.73557: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204063.73579: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204063.73592: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204063.73603: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204063.73622: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204063.73640: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204063.73683: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204063.73697: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204063.73709: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204063.73726: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204063.74203: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204063.74221: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204063.74238: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204063.74331: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session 
id: 2 <<< 12180 1727204063.76242: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204063.76249: stdout chunk (state=3): >>><<< 12180 1727204063.76255: stderr chunk (state=3): >>><<< 12180 1727204063.76474: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204063.76482: handler run complete 12180 1727204063.76484: attempt loop complete, returning result 12180 1727204063.76486: _execute() done 12180 1727204063.76489: dumping result to json 12180 1727204063.76491: done dumping result, returning 12180 1727204063.76493: done running TaskExecutor() for managed-node1/TASK: Install pgrep, sysctl [0affcd87-79f5-ccb1-55ae-000000000011] 12180 1727204063.76496: sending task result for task 0affcd87-79f5-ccb1-55ae-000000000011 12180 1727204063.76580: done sending task result for task 
0affcd87-79f5-ccb1-55ae-000000000011 12180 1727204063.76584: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 12180 1727204063.76670: no more pending results, returning what we have 12180 1727204063.76673: results queue empty 12180 1727204063.76675: checking for any_errors_fatal 12180 1727204063.76683: done checking for any_errors_fatal 12180 1727204063.76684: checking for max_fail_percentage 12180 1727204063.76685: done checking for max_fail_percentage 12180 1727204063.76686: checking to see if all hosts have failed and the running result is not ok 12180 1727204063.76687: done checking to see if all hosts have failed 12180 1727204063.76688: getting the remaining hosts for this loop 12180 1727204063.76690: done getting the remaining hosts for this loop 12180 1727204063.76695: getting the next task for host managed-node1 12180 1727204063.76703: done getting next task for host managed-node1 12180 1727204063.76705: ^ task is: TASK: Create test interfaces 12180 1727204063.76709: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12180 1727204063.76712: getting variables 12180 1727204063.76715: in VariableManager get_vars() 12180 1727204063.76762: Calling all_inventory to load vars for managed-node1 12180 1727204063.76771: Calling groups_inventory to load vars for managed-node1 12180 1727204063.76774: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204063.76785: Calling all_plugins_play to load vars for managed-node1 12180 1727204063.76788: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204063.76791: Calling groups_plugins_play to load vars for managed-node1 12180 1727204063.77079: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204063.77460: done with get_vars() 12180 1727204063.77605: done getting variables 12180 1727204063.77844: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Create test interfaces] ************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:35 Tuesday 24 September 2024 14:54:23 -0400 (0:00:01.428) 0:00:11.191 ***** 12180 1727204063.77993: entering _queue_task() for managed-node1/shell 12180 1727204063.77995: Creating lock for shell 12180 1727204063.79868: worker is 1 (out of 1 available) 12180 1727204063.79880: exiting _queue_task() for managed-node1/shell 12180 1727204063.79892: done queuing things up, now waiting for results queue to drain 12180 1727204063.79894: waiting for pending results... 
12180 1727204063.80245: running TaskExecutor() for managed-node1/TASK: Create test interfaces 12180 1727204063.80434: in run() - task 0affcd87-79f5-ccb1-55ae-000000000012 12180 1727204063.80617: variable 'ansible_search_path' from source: unknown 12180 1727204063.80626: variable 'ansible_search_path' from source: unknown 12180 1727204063.80780: calling self._execute() 12180 1727204063.80930: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204063.80980: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204063.80998: variable 'omit' from source: magic vars 12180 1727204063.81683: variable 'ansible_distribution_major_version' from source: facts 12180 1727204063.81766: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204063.81816: variable 'omit' from source: magic vars 12180 1727204063.81878: variable 'omit' from source: magic vars 12180 1727204063.82849: variable 'dhcp_interface1' from source: play vars 12180 1727204063.82863: variable 'dhcp_interface2' from source: play vars 12180 1727204063.82910: variable 'omit' from source: magic vars 12180 1727204063.82970: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12180 1727204063.83098: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12180 1727204063.83128: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12180 1727204063.83177: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204063.83321: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204063.83381: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12180 1727204063.83451: variable 'ansible_host' from source: host 
vars for 'managed-node1' 12180 1727204063.83461: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204063.83627: Set connection var ansible_pipelining to False 12180 1727204063.83643: Set connection var ansible_shell_type to sh 12180 1727204063.83654: Set connection var ansible_timeout to 10 12180 1727204063.83668: Set connection var ansible_connection to ssh 12180 1727204063.83680: Set connection var ansible_shell_executable to /bin/sh 12180 1727204063.83691: Set connection var ansible_module_compression to ZIP_DEFLATED 12180 1727204063.83726: variable 'ansible_shell_executable' from source: unknown 12180 1727204063.83734: variable 'ansible_connection' from source: unknown 12180 1727204063.83750: variable 'ansible_module_compression' from source: unknown 12180 1727204063.83757: variable 'ansible_shell_type' from source: unknown 12180 1727204063.83766: variable 'ansible_shell_executable' from source: unknown 12180 1727204063.83774: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204063.83782: variable 'ansible_pipelining' from source: unknown 12180 1727204063.83789: variable 'ansible_timeout' from source: unknown 12180 1727204063.83798: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204063.83945: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12180 1727204063.84089: variable 'omit' from source: magic vars 12180 1727204063.84100: starting attempt loop 12180 1727204063.84107: running the handler 12180 1727204063.84129: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12180 1727204063.84184: _low_level_execute_command(): starting 12180 1727204063.84209: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12180 1727204063.87281: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204063.87287: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204063.87304: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204063.87359: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204063.88401: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204063.88429: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204063.88521: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204063.90079: stdout chunk (state=3): >>>/root <<< 12180 1727204063.90277: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204063.90281: 
stdout chunk (state=3): >>><<< 12180 1727204063.90284: stderr chunk (state=3): >>><<< 12180 1727204063.90404: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204063.90409: _low_level_execute_command(): starting 12180 1727204063.90412: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204063.903061-13513-200320331148593 `" && echo ansible-tmp-1727204063.903061-13513-200320331148593="` echo /root/.ansible/tmp/ansible-tmp-1727204063.903061-13513-200320331148593 `" ) && sleep 0' 12180 1727204063.91997: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204063.92001: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 12180 1727204063.92090: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204063.92147: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204063.92151: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204063.92457: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204063.92461: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204063.92635: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204063.92650: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204063.92654: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204063.92821: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204063.94625: stdout chunk (state=3): >>>ansible-tmp-1727204063.903061-13513-200320331148593=/root/.ansible/tmp/ansible-tmp-1727204063.903061-13513-200320331148593 <<< 12180 1727204063.94843: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204063.94847: stdout chunk (state=3): >>><<< 12180 1727204063.94854: stderr chunk (state=3): >>><<< 12180 1727204063.95025: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204063.903061-13513-200320331148593=/root/.ansible/tmp/ansible-tmp-1727204063.903061-13513-200320331148593 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204063.95055: variable 'ansible_module_compression' from source: unknown 12180 1727204063.95118: ANSIBALLZ: Using generic lock for ansible.legacy.command 12180 1727204063.95121: ANSIBALLZ: Acquiring lock 12180 1727204063.95124: ANSIBALLZ: Lock acquired: 140650305861680 12180 1727204063.95126: ANSIBALLZ: Creating module 12180 1727204064.29277: ANSIBALLZ: Writing module into payload 12180 1727204064.29391: ANSIBALLZ: Writing module 12180 1727204064.29414: ANSIBALLZ: Renaming module 12180 1727204064.29426: ANSIBALLZ: Done creating module 12180 1727204064.29442: variable 'ansible_facts' from source: unknown 12180 1727204064.29582: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1727204063.903061-13513-200320331148593/AnsiballZ_command.py 12180 1727204064.29739: Sending initial data 12180 1727204064.29743: Sent initial data (155 bytes) 12180 1727204064.30844: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204064.30848: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204064.30859: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204064.30889: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204064.30930: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204064.30934: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204064.30945: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204064.30958: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204064.31046: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204064.31049: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204064.31052: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204064.31055: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204064.31057: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204064.31060: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204064.31062: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204064.31067: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 
1727204064.31145: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204064.31161: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204064.31172: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204064.31305: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204064.33138: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12180 1727204064.33195: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12180 1727204064.33248: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12180cbnqllfr/tmp3f8h7ujf /root/.ansible/tmp/ansible-tmp-1727204063.903061-13513-200320331148593/AnsiballZ_command.py <<< 12180 1727204064.33300: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12180 1727204064.34711: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204064.34807: stderr chunk (state=3): >>><<< 12180 1727204064.34811: stdout chunk (state=3): >>><<< 12180 1727204064.34834: done transferring module to remote 12180 1727204064.34843: _low_level_execute_command(): starting 12180 1727204064.34848: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1727204063.903061-13513-200320331148593/ /root/.ansible/tmp/ansible-tmp-1727204063.903061-13513-200320331148593/AnsiballZ_command.py && sleep 0' 12180 1727204064.36490: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204064.36498: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204064.36509: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204064.36522: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204064.36629: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204064.36642: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204064.36706: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204064.36718: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204064.36814: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204064.36827: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204064.36835: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204064.36845: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204064.36916: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204064.36933: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204064.36936: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204064.36947: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204064.37067: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204064.37240: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204064.37263: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204064.37432: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204064.39190: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204064.39288: stderr chunk (state=3): >>><<< 12180 1727204064.39292: stdout chunk (state=3): >>><<< 12180 1727204064.39295: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204064.39297: _low_level_execute_command(): starting 12180 1727204064.39301: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 
/root/.ansible/tmp/ansible-tmp-1727204063.903061-13513-200320331148593/AnsiballZ_command.py && sleep 0' 12180 1727204064.41283: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204064.41591: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204064.41608: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204064.41630: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204064.41708: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204064.41743: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204064.41757: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204064.41780: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204064.41804: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204064.41816: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204064.41843: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204064.41857: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204064.41874: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204064.41902: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204064.41938: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204064.41954: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204064.42154: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 
1727204064.42255: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204064.42275: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204064.42377: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204065.96401: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ ip link add test1 type veth peer name test1p\n+ ip link add test2 type veth peer name test2p\n++ pgrep NetworkManager\n+ '[' -n 619 ']'\n+ nmcli d set test1 managed true\n+ nmcli d set test2 managed true\n+ nmcli d set test1p managed false\n+ nmcli d set test2p managed false\n+ ip link set test1p up\n+ ip link set test2p up\n+ ip link add name testbr type bridge forward_delay 0\n++ pgrep NetworkManager\n+ '[' -n 619 ']'\n+ nmcli d set testbr managed false\n+ ip link set testbr up\n+ timer=0\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ let timer+=1\n+ '[' 1 -eq 30 ']'\n+ sleep 1\n+ rc=0\n+ ip addr add 192.0.2.1/24 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip -6 addr add 2001:DB8::1/32 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ grep 'release 6' /etc/redhat-release\n+ ip link set test1p master testbr\n+ ip link set test2p master testbr\n+ systemctl is-active firewalld\ninactive\n+ dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 
'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/<<< 12180 1727204065.96423: stdout chunk (state=3): >>>show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! 
rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "start": "2024-09-24 14:54:24.553983", "end": "2024-09-24 14:54:25.962871", "delta": "0:00:01.408888", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! 
rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12180 1727204065.97848: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
<<< 12180 1727204065.97852: stdout chunk (state=3): >>><<< 12180 1727204065.97854: stderr chunk (state=3): >>><<< 12180 1727204065.98038: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ ip link add test1 type veth peer name test1p\n+ ip link add test2 type veth peer name test2p\n++ pgrep NetworkManager\n+ '[' -n 619 ']'\n+ nmcli d set test1 managed true\n+ nmcli d set test2 managed true\n+ nmcli d set test1p managed false\n+ nmcli d set test2p managed false\n+ ip link set test1p up\n+ ip link set test2p up\n+ ip link add name testbr type bridge forward_delay 0\n++ pgrep NetworkManager\n+ '[' -n 619 ']'\n+ nmcli d set testbr managed false\n+ ip link set testbr up\n+ timer=0\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ let timer+=1\n+ '[' 1 -eq 30 ']'\n+ sleep 1\n+ rc=0\n+ ip addr add 192.0.2.1/24 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip -6 addr add 2001:DB8::1/32 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ grep 'release 6' /etc/redhat-release\n+ ip link set test1p master testbr\n+ ip link set test2p master testbr\n+ systemctl is-active firewalld\ninactive\n+ dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # 
NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active 
firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "start": "2024-09-24 14:54:24.553983", "end": "2024-09-24 14:54:25.962871", "delta": "0:00:01.408888", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! 
ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
12180 1727204065.98046: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n "$(pgrep NetworkManager)" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the \'testbr\' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n "$(pgrep NetworkManager)" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q \'inet [1-9]\'\ndo\n let "timer+=1"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc="$?"\n if [ "$rc" != 0 ]; then\n echo NOTICE - could not add testbr - error code "$rc"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc="$?"\n if [ "$rc" != 0 ]; then\n echo NOTICE - could not add testbr - error code "$rc"\n continue\n fi\ndone\n\nif grep \'release 6\' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! 
rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo \'interface testbr {\' > /etc/radvd.conf\n echo \' AdvSendAdvert on;\' >> /etc/radvd.conf\n echo \' prefix 2001:DB8::/64 { \' >> /etc/radvd.conf\n echo \' AdvOnLink on; }; \' >> /etc/radvd.conf\n echo \' }; \' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service="$service"; then\n firewall-cmd --add-service "$service"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204063.903061-13513-200320331148593/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12180 1727204065.98049: _low_level_execute_command(): starting 12180 1727204065.98051: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204063.903061-13513-200320331148593/ > /dev/null 2>&1 && sleep 0' 12180 1727204065.99280: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204065.99300: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204065.99316: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204065.99340: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204065.99393: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204065.99409: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204065.99425: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204065.99447: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204065.99460: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204065.99475: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204065.99488: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204065.99504: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204065.99524: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204065.99563: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204065.99582: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204065.99596: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204065.99682: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204065.99707: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204065.99735: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204065.99841: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204066.01740: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204066.01744: stdout chunk (state=3): >>><<< 12180 1727204066.01747: stderr chunk (state=3): >>><<< 12180 1727204066.01971: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204066.01975: handler run complete 12180 1727204066.01977: Evaluated conditional (False): False 12180 1727204066.01980: attempt loop complete, returning result 12180 1727204066.01982: _execute() done 12180 1727204066.01988: dumping result to json 12180 1727204066.01991: done dumping result, returning 12180 1727204066.01993: done running TaskExecutor() for managed-node1/TASK: Create test interfaces [0affcd87-79f5-ccb1-55ae-000000000012] 12180 1727204066.01995: sending task result for task 0affcd87-79f5-ccb1-55ae-000000000012 12180 1727204066.02083: done sending task result for task 0affcd87-79f5-ccb1-55ae-000000000012 12180 1727204066.02088: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed 
false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! 
rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "delta": "0:00:01.408888", "end": "2024-09-24 14:54:25.962871", "rc": 0, "start": "2024-09-24 14:54:24.553983" } STDERR: + exec + ip link add test1 type veth peer name test1p + ip link add test2 type veth peer name test2p ++ pgrep NetworkManager + '[' -n 619 ']' + nmcli d set test1 managed true + nmcli d set test2 managed true + nmcli d set test1p managed false + nmcli d set test2p managed false + ip link set test1p up + ip link set test2p up + ip link add name testbr type bridge forward_delay 0 ++ pgrep NetworkManager + '[' -n 619 ']' + nmcli d set testbr managed false + ip link set testbr up + timer=0 + ip addr show testbr + grep -q 'inet [1-9]' + let timer+=1 + '[' 1 -eq 30 ']' + sleep 1 + rc=0 + ip addr add 192.0.2.1/24 dev testbr + '[' 0 '!=' 0 ']' + ip -6 addr add 2001:DB8::1/32 dev testbr + '[' 0 '!=' 0 ']' + ip addr show testbr + grep -q 'inet [1-9]' + grep 'release 6' /etc/redhat-release + ip link set test1p master testbr + ip link set test2p master testbr + systemctl is-active firewalld inactive + dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces 12180 1727204066.02170: no more pending results, returning what we have 12180 1727204066.02173: results queue empty 12180 1727204066.02174: checking for any_errors_fatal 12180 1727204066.02180: done checking for any_errors_fatal 12180 1727204066.02181: checking for max_fail_percentage 12180 1727204066.02183: done checking for max_fail_percentage 12180 1727204066.02183: checking to see if all hosts have failed and 
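The task result above embeds a shell script whose core workaround is a bounded retry loop: it re-checks whether `testbr` has an IPv4 address, re-applies the `ip addr add` step on each pass, and gives up after 30 iterations (tracking the NetworkManager race in rhbz#2079642). As an aside, that pattern can be factored into a reusable helper; the sketch below is hypothetical (`retry_until` is not part of the original playbook) and assumes only POSIX sh.

```shell
#!/bin/sh
# Hypothetical helper illustrating the retry pattern from the traced script:
# keep evaluating a check, re-running the setup action after each failed
# check, and give up after a fixed number of one-second attempts.
retry_until() {
    check=$1      # shell expression that must eventually succeed
    action=$2     # setup step re-applied on every failed pass
    limit=$3      # maximum number of attempts before giving up
    timer=0
    until eval "$check"; do
        timer=$((timer + 1))
        if [ "$timer" -ge "$limit" ]; then
            echo "ERROR - gave up after $limit attempts" >&2
            return 1
        fi
        # Mirror the script's behavior: a failed action is only a NOTICE,
        # because the next pass will retry it anyway.
        eval "$action" || echo "NOTICE - action failed, retrying" >&2
        sleep 1
    done
    return 0
}
```

Under this sketch, the loop in the task body would collapse to something like `retry_until "ip addr show testbr | grep -q 'inet [1-9]'" "ip addr add 192.0.2.1/24 dev testbr" 30` (run as root, with iproute2 available).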
the running result is not ok 12180 1727204066.02184: done checking to see if all hosts have failed 12180 1727204066.02185: getting the remaining hosts for this loop 12180 1727204066.02187: done getting the remaining hosts for this loop 12180 1727204066.02190: getting the next task for host managed-node1 12180 1727204066.02199: done getting next task for host managed-node1 12180 1727204066.02202: ^ task is: TASK: Include the task 'get_interface_stat.yml' 12180 1727204066.02205: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12180 1727204066.02210: getting variables 12180 1727204066.02212: in VariableManager get_vars() 12180 1727204066.02251: Calling all_inventory to load vars for managed-node1 12180 1727204066.02254: Calling groups_inventory to load vars for managed-node1 12180 1727204066.02257: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204066.02269: Calling all_plugins_play to load vars for managed-node1 12180 1727204066.02271: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204066.02274: Calling groups_plugins_play to load vars for managed-node1 12180 1727204066.02482: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204066.02686: done with get_vars() 12180 1727204066.02704: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Tuesday 24 September 2024 14:54:26 -0400 (0:00:02.248) 0:00:13.439 ***** 12180 1727204066.02918: entering _queue_task() for managed-node1/include_tasks 12180 1727204066.03426: worker is 1 (out of 1 available) 12180 1727204066.03439: exiting _queue_task() for managed-node1/include_tasks 12180 1727204066.03589: done queuing things up, now waiting for results queue to drain 12180 1727204066.03592: waiting for pending results... 
12180 1727204066.04467: running TaskExecutor() for managed-node1/TASK: Include the task 'get_interface_stat.yml' 12180 1727204066.04597: in run() - task 0affcd87-79f5-ccb1-55ae-000000000016 12180 1727204066.04787: variable 'ansible_search_path' from source: unknown 12180 1727204066.04795: variable 'ansible_search_path' from source: unknown 12180 1727204066.04842: calling self._execute() 12180 1727204066.04927: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204066.05280: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204066.05296: variable 'omit' from source: magic vars 12180 1727204066.05684: variable 'ansible_distribution_major_version' from source: facts 12180 1727204066.05995: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204066.06180: _execute() done 12180 1727204066.06188: dumping result to json 12180 1727204066.06197: done dumping result, returning 12180 1727204066.06207: done running TaskExecutor() for managed-node1/TASK: Include the task 'get_interface_stat.yml' [0affcd87-79f5-ccb1-55ae-000000000016] 12180 1727204066.06217: sending task result for task 0affcd87-79f5-ccb1-55ae-000000000016 12180 1727204066.06338: done sending task result for task 0affcd87-79f5-ccb1-55ae-000000000016 12180 1727204066.06345: WORKER PROCESS EXITING 12180 1727204066.06384: no more pending results, returning what we have 12180 1727204066.06388: in VariableManager get_vars() 12180 1727204066.06436: Calling all_inventory to load vars for managed-node1 12180 1727204066.06439: Calling groups_inventory to load vars for managed-node1 12180 1727204066.06441: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204066.06455: Calling all_plugins_play to load vars for managed-node1 12180 1727204066.06457: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204066.06460: Calling groups_plugins_play to load vars for managed-node1 12180 
1727204066.06634: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204066.06840: done with get_vars() 12180 1727204066.06850: variable 'ansible_search_path' from source: unknown 12180 1727204066.06851: variable 'ansible_search_path' from source: unknown 12180 1727204066.06899: we have included files to process 12180 1727204066.06901: generating all_blocks data 12180 1727204066.06902: done generating all_blocks data 12180 1727204066.06903: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 12180 1727204066.06904: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 12180 1727204066.06906: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 12180 1727204066.07387: done processing included file 12180 1727204066.07389: iterating over new_blocks loaded from include file 12180 1727204066.07391: in VariableManager get_vars() 12180 1727204066.07413: done with get_vars() 12180 1727204066.07415: filtering new block on tags 12180 1727204066.07655: done filtering new block on tags 12180 1727204066.07658: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node1 12180 1727204066.07665: extending task lists for all hosts with included blocks 12180 1727204066.07892: done extending task lists 12180 1727204066.07894: done processing included files 12180 1727204066.07895: results queue empty 12180 1727204066.07895: checking for any_errors_fatal 12180 1727204066.07903: done checking for any_errors_fatal 12180 1727204066.07904: checking for max_fail_percentage 12180 1727204066.07905: done checking for 
max_fail_percentage 12180 1727204066.07906: checking to see if all hosts have failed and the running result is not ok 12180 1727204066.07907: done checking to see if all hosts have failed 12180 1727204066.07908: getting the remaining hosts for this loop 12180 1727204066.07909: done getting the remaining hosts for this loop 12180 1727204066.07911: getting the next task for host managed-node1 12180 1727204066.07916: done getting next task for host managed-node1 12180 1727204066.07918: ^ task is: TASK: Get stat for interface {{ interface }} 12180 1727204066.07920: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12180 1727204066.07923: getting variables 12180 1727204066.07924: in VariableManager get_vars() 12180 1727204066.07940: Calling all_inventory to load vars for managed-node1 12180 1727204066.07943: Calling groups_inventory to load vars for managed-node1 12180 1727204066.07945: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204066.07951: Calling all_plugins_play to load vars for managed-node1 12180 1727204066.07953: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204066.07956: Calling groups_plugins_play to load vars for managed-node1 12180 1727204066.08462: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204066.09106: done with get_vars() 12180 1727204066.09119: done getting variables 12180 1727204066.09723: variable 'interface' from source: task vars 12180 1727204066.09728: variable 'dhcp_interface1' from source: play vars 12180 1727204066.09801: variable 'dhcp_interface1' from source: play vars TASK [Get stat for interface test1] ******************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 14:54:26 -0400 (0:00:00.069) 0:00:13.511 ***** 12180 1727204066.09912: entering _queue_task() for managed-node1/stat 12180 1727204066.10656: worker is 1 (out of 1 available) 12180 1727204066.10671: exiting _queue_task() for managed-node1/stat 12180 1727204066.10684: done queuing things up, now waiting for results queue to drain 12180 1727204066.10686: waiting for pending results... 
12180 1727204066.11667: running TaskExecutor() for managed-node1/TASK: Get stat for interface test1 12180 1727204066.12093: in run() - task 0affcd87-79f5-ccb1-55ae-000000000153 12180 1727204066.12137: variable 'ansible_search_path' from source: unknown 12180 1727204066.12184: variable 'ansible_search_path' from source: unknown 12180 1727204066.12269: calling self._execute() 12180 1727204066.12423: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204066.12567: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204066.12582: variable 'omit' from source: magic vars 12180 1727204066.13273: variable 'ansible_distribution_major_version' from source: facts 12180 1727204066.13443: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204066.13455: variable 'omit' from source: magic vars 12180 1727204066.13523: variable 'omit' from source: magic vars 12180 1727204066.13840: variable 'interface' from source: task vars 12180 1727204066.13982: variable 'dhcp_interface1' from source: play vars 12180 1727204066.14054: variable 'dhcp_interface1' from source: play vars 12180 1727204066.14241: variable 'omit' from source: magic vars 12180 1727204066.14292: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12180 1727204066.14340: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12180 1727204066.14369: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12180 1727204066.14431: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204066.14517: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204066.14587: variable 'inventory_hostname' from source: host vars for 
'managed-node1' 12180 1727204066.14622: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204066.14669: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204066.14919: Set connection var ansible_pipelining to False 12180 1727204066.14928: Set connection var ansible_shell_type to sh 12180 1727204066.14939: Set connection var ansible_timeout to 10 12180 1727204066.14949: Set connection var ansible_connection to ssh 12180 1727204066.14959: Set connection var ansible_shell_executable to /bin/sh 12180 1727204066.14972: Set connection var ansible_module_compression to ZIP_DEFLATED 12180 1727204066.15098: variable 'ansible_shell_executable' from source: unknown 12180 1727204066.15108: variable 'ansible_connection' from source: unknown 12180 1727204066.15115: variable 'ansible_module_compression' from source: unknown 12180 1727204066.15122: variable 'ansible_shell_type' from source: unknown 12180 1727204066.15128: variable 'ansible_shell_executable' from source: unknown 12180 1727204066.15134: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204066.15141: variable 'ansible_pipelining' from source: unknown 12180 1727204066.15147: variable 'ansible_timeout' from source: unknown 12180 1727204066.15155: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204066.15496: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 12180 1727204066.15513: variable 'omit' from source: magic vars 12180 1727204066.15535: starting attempt loop 12180 1727204066.15543: running the handler 12180 1727204066.15561: _low_level_execute_command(): starting 12180 1727204066.15578: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12180 
1727204066.16482: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204066.16510: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204066.16533: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204066.16554: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204066.16601: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204066.16613: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204066.17385: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204066.17415: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204066.17522: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204066.17536: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204066.17551: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204066.17567: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204066.17584: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204066.17597: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204066.17612: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204066.17633: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204066.17715: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204066.17746: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204066.17783: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204066.18071: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204066.19604: stdout chunk (state=3): >>>/root <<< 12180 1727204066.19803: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204066.19807: stdout chunk (state=3): >>><<< 12180 1727204066.19809: stderr chunk (state=3): >>><<< 12180 1727204066.19932: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204066.19936: _low_level_execute_command(): starting 12180 1727204066.19939: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204066.1983106-13847-177102399268735 `" && echo 
ansible-tmp-1727204066.1983106-13847-177102399268735="` echo /root/.ansible/tmp/ansible-tmp-1727204066.1983106-13847-177102399268735 `" ) && sleep 0' 12180 1727204066.21582: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204066.21586: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204066.21624: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204066.21627: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204066.21630: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204066.21685: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204066.22193: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204066.22196: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204066.22271: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204066.24145: stdout chunk (state=3): >>>ansible-tmp-1727204066.1983106-13847-177102399268735=/root/.ansible/tmp/ansible-tmp-1727204066.1983106-13847-177102399268735 <<< 12180 1727204066.24284: stderr chunk (state=3): >>>debug2: Received exit status from 
master 0 <<< 12180 1727204066.24342: stderr chunk (state=3): >>><<< 12180 1727204066.24345: stdout chunk (state=3): >>><<< 12180 1727204066.24368: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204066.1983106-13847-177102399268735=/root/.ansible/tmp/ansible-tmp-1727204066.1983106-13847-177102399268735 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204066.24419: variable 'ansible_module_compression' from source: unknown 12180 1727204066.24485: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12180cbnqllfr/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 12180 1727204066.24520: variable 'ansible_facts' from source: unknown 12180 1727204066.24611: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204066.1983106-13847-177102399268735/AnsiballZ_stat.py 12180 1727204066.25236: Sending initial data 12180 1727204066.25240: Sent initial data 
(153 bytes) 12180 1727204066.27973: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204066.28389: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204066.28399: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204066.28414: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204066.28457: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204066.28463: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204066.28475: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204066.28488: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204066.28496: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204066.28502: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204066.28509: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204066.28518: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204066.28535: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204066.28538: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204066.28541: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204066.28549: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204066.28619: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204066.28638: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 
1727204066.28651: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204066.28734: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204066.30479: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12180 1727204066.30540: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12180 1727204066.30595: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12180cbnqllfr/tmpq11y91gi /root/.ansible/tmp/ansible-tmp-1727204066.1983106-13847-177102399268735/AnsiballZ_stat.py <<< 12180 1727204066.30648: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12180 1727204066.32008: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204066.32098: stderr chunk (state=3): >>><<< 12180 1727204066.32102: stdout chunk (state=3): >>><<< 12180 1727204066.32126: done transferring module to remote 12180 1727204066.32138: _low_level_execute_command(): starting 12180 1727204066.32143: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204066.1983106-13847-177102399268735/ /root/.ansible/tmp/ansible-tmp-1727204066.1983106-13847-177102399268735/AnsiballZ_stat.py && sleep 0' 12180 1727204066.33823: stderr chunk (state=2): 
>>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204066.33889: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204066.33906: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204066.33930: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204066.33980: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204066.34109: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204066.34123: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204066.34141: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204066.34157: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204066.34173: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204066.34185: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204066.34199: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204066.34220: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204066.34232: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204066.34243: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204066.34255: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204066.34445: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204066.34462: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204066.34478: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 12180 1727204066.34651: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204066.36454: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204066.36458: stdout chunk (state=3): >>><<< 12180 1727204066.36461: stderr chunk (state=3): >>><<< 12180 1727204066.36561: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204066.36567: _low_level_execute_command(): starting 12180 1727204066.36569: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204066.1983106-13847-177102399268735/AnsiballZ_stat.py && sleep 0' 12180 1727204066.39026: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204066.39047: stderr chunk (state=3): >>>debug1: 
Reading configuration data /root/.ssh/config <<< 12180 1727204066.39067: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204066.39094: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204066.39141: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204066.39283: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204066.39307: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204066.39325: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204066.39426: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204066.39438: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204066.39450: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204066.39467: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204066.39484: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204066.39496: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204066.39507: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204066.39528: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204066.39608: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204066.39759: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204066.39776: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204066.39974: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 12180 1727204066.53110: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test1", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 25120, "dev": 21, "nlink": 1, "atime": 1727204064.56237, "mtime": 1727204064.56237, "ctime": 1727204064.56237, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test1", "follow": false, "checksum_algorithm": "sha1"}}} <<< 12180 1727204066.54115: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
<<< 12180 1727204066.54120: stdout chunk (state=3): >>><<< 12180 1727204066.54123: stderr chunk (state=3): >>><<< 12180 1727204066.54281: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test1", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 25120, "dev": 21, "nlink": 1, "atime": 1727204064.56237, "mtime": 1727204064.56237, "ctime": 1727204064.56237, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test1", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 12180 1727204066.54290: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/test1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204066.1983106-13847-177102399268735/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12180 1727204066.54293: _low_level_execute_command(): starting 12180 1727204066.54296: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204066.1983106-13847-177102399268735/ > /dev/null 2>&1 && sleep 0' 12180 1727204066.54882: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204066.54898: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204066.54913: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204066.54934: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204066.54983: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204066.55006: stderr chunk 
(state=3): >>>debug2: match not found <<< 12180 1727204066.55025: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204066.55045: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204066.55057: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204066.55072: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204066.55085: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204066.55098: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204066.55113: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204066.55125: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204066.55140: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204066.55153: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204066.55233: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204066.55253: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204066.55272: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204066.55371: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204066.57201: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204066.57250: stderr chunk (state=3): >>><<< 12180 1727204066.57254: stdout chunk (state=3): >>><<< 12180 1727204066.57286: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204066.57290: handler run complete 12180 1727204066.57396: attempt loop complete, returning result 12180 1727204066.57401: _execute() done 12180 1727204066.57403: dumping result to json 12180 1727204066.57405: done dumping result, returning 12180 1727204066.57416: done running TaskExecutor() for managed-node1/TASK: Get stat for interface test1 [0affcd87-79f5-ccb1-55ae-000000000153] 12180 1727204066.57423: sending task result for task 0affcd87-79f5-ccb1-55ae-000000000153 12180 1727204066.57547: done sending task result for task 0affcd87-79f5-ccb1-55ae-000000000153 12180 1727204066.57550: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "stat": { "atime": 1727204064.56237, "block_size": 4096, "blocks": 0, "ctime": 1727204064.56237, "dev": 21, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 25120, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, 
"issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "mode": "0777", "mtime": 1727204064.56237, "nlink": 1, "path": "/sys/class/net/test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 12180 1727204066.57638: no more pending results, returning what we have 12180 1727204066.57642: results queue empty 12180 1727204066.57642: checking for any_errors_fatal 12180 1727204066.57643: done checking for any_errors_fatal 12180 1727204066.57644: checking for max_fail_percentage 12180 1727204066.57646: done checking for max_fail_percentage 12180 1727204066.57647: checking to see if all hosts have failed and the running result is not ok 12180 1727204066.57648: done checking to see if all hosts have failed 12180 1727204066.57648: getting the remaining hosts for this loop 12180 1727204066.57650: done getting the remaining hosts for this loop 12180 1727204066.57653: getting the next task for host managed-node1 12180 1727204066.57661: done getting next task for host managed-node1 12180 1727204066.57663: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 12180 1727204066.57667: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12180 1727204066.57672: getting variables 12180 1727204066.57673: in VariableManager get_vars() 12180 1727204066.57718: Calling all_inventory to load vars for managed-node1 12180 1727204066.57726: Calling groups_inventory to load vars for managed-node1 12180 1727204066.57728: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204066.57737: Calling all_plugins_play to load vars for managed-node1 12180 1727204066.57740: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204066.57742: Calling groups_plugins_play to load vars for managed-node1 12180 1727204066.57926: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204066.58162: done with get_vars() 12180 1727204066.58176: done getting variables 12180 1727204066.58273: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) 12180 1727204066.58393: variable 'interface' from source: task vars 12180 1727204066.58397: variable 'dhcp_interface1' from source: play vars 12180 1727204066.58471: variable 'dhcp_interface1' from source: play vars TASK [Assert that the interface is present - 'test1'] ************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Tuesday 24 September 2024 14:54:26 -0400 (0:00:00.485) 0:00:13.996 ***** 12180 1727204066.58503: entering _queue_task() for managed-node1/assert 12180 1727204066.58504: Creating lock for assert 12180 1727204066.58783: worker is 1 (out of 1 available) 12180 1727204066.58795: exiting _queue_task() for managed-node1/assert 12180 1727204066.58808: done queuing things up, now waiting for results queue to drain 12180 
1727204066.58809: waiting for pending results... 12180 1727204066.59074: running TaskExecutor() for managed-node1/TASK: Assert that the interface is present - 'test1' 12180 1727204066.59197: in run() - task 0affcd87-79f5-ccb1-55ae-000000000017 12180 1727204066.59217: variable 'ansible_search_path' from source: unknown 12180 1727204066.59225: variable 'ansible_search_path' from source: unknown 12180 1727204066.59270: calling self._execute() 12180 1727204066.59357: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204066.59377: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204066.59392: variable 'omit' from source: magic vars 12180 1727204066.59753: variable 'ansible_distribution_major_version' from source: facts 12180 1727204066.59773: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204066.59784: variable 'omit' from source: magic vars 12180 1727204066.59836: variable 'omit' from source: magic vars 12180 1727204066.59940: variable 'interface' from source: task vars 12180 1727204066.59950: variable 'dhcp_interface1' from source: play vars 12180 1727204066.60026: variable 'dhcp_interface1' from source: play vars 12180 1727204066.60051: variable 'omit' from source: magic vars 12180 1727204066.60102: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12180 1727204066.60147: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12180 1727204066.60175: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12180 1727204066.60197: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204066.60211: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204066.60259: 
variable 'inventory_hostname' from source: host vars for 'managed-node1' 12180 1727204066.60271: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204066.60278: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204066.60388: Set connection var ansible_pipelining to False 12180 1727204066.60396: Set connection var ansible_shell_type to sh 12180 1727204066.60407: Set connection var ansible_timeout to 10 12180 1727204066.60417: Set connection var ansible_connection to ssh 12180 1727204066.60426: Set connection var ansible_shell_executable to /bin/sh 12180 1727204066.60438: Set connection var ansible_module_compression to ZIP_DEFLATED 12180 1727204066.60476: variable 'ansible_shell_executable' from source: unknown 12180 1727204066.60483: variable 'ansible_connection' from source: unknown 12180 1727204066.60489: variable 'ansible_module_compression' from source: unknown 12180 1727204066.60495: variable 'ansible_shell_type' from source: unknown 12180 1727204066.60501: variable 'ansible_shell_executable' from source: unknown 12180 1727204066.60507: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204066.60514: variable 'ansible_pipelining' from source: unknown 12180 1727204066.60524: variable 'ansible_timeout' from source: unknown 12180 1727204066.60539: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204066.60692: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12180 1727204066.60707: variable 'omit' from source: magic vars 12180 1727204066.60716: starting attempt loop 12180 1727204066.60721: running the handler 12180 1727204066.60861: variable 'interface_stat' from source: set_fact 12180 
1727204066.60892: Evaluated conditional (interface_stat.stat.exists): True 12180 1727204066.60902: handler run complete 12180 1727204066.60920: attempt loop complete, returning result 12180 1727204066.60927: _execute() done 12180 1727204066.60934: dumping result to json 12180 1727204066.60941: done dumping result, returning 12180 1727204066.60952: done running TaskExecutor() for managed-node1/TASK: Assert that the interface is present - 'test1' [0affcd87-79f5-ccb1-55ae-000000000017] 12180 1727204066.60961: sending task result for task 0affcd87-79f5-ccb1-55ae-000000000017 12180 1727204066.61074: done sending task result for task 0affcd87-79f5-ccb1-55ae-000000000017 12180 1727204066.61081: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false } MSG: All assertions passed 12180 1727204066.61142: no more pending results, returning what we have 12180 1727204066.61147: results queue empty 12180 1727204066.61148: checking for any_errors_fatal 12180 1727204066.61157: done checking for any_errors_fatal 12180 1727204066.61158: checking for max_fail_percentage 12180 1727204066.61160: done checking for max_fail_percentage 12180 1727204066.61162: checking to see if all hosts have failed and the running result is not ok 12180 1727204066.61163: done checking to see if all hosts have failed 12180 1727204066.61166: getting the remaining hosts for this loop 12180 1727204066.61168: done getting the remaining hosts for this loop 12180 1727204066.61172: getting the next task for host managed-node1 12180 1727204066.61181: done getting next task for host managed-node1 12180 1727204066.61184: ^ task is: TASK: Include the task 'get_interface_stat.yml' 12180 1727204066.61188: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12180 1727204066.61193: getting variables 12180 1727204066.61194: in VariableManager get_vars() 12180 1727204066.61245: Calling all_inventory to load vars for managed-node1 12180 1727204066.61249: Calling groups_inventory to load vars for managed-node1 12180 1727204066.61252: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204066.61265: Calling all_plugins_play to load vars for managed-node1 12180 1727204066.61268: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204066.61271: Calling groups_plugins_play to load vars for managed-node1 12180 1727204066.61461: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204066.61674: done with get_vars() 12180 1727204066.61687: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Tuesday 24 September 2024 14:54:26 -0400 (0:00:00.032) 0:00:14.029 ***** 12180 1727204066.61789: entering _queue_task() for managed-node1/include_tasks 12180 1727204066.62378: worker is 1 (out of 1 available) 12180 1727204066.62392: exiting _queue_task() for managed-node1/include_tasks 12180 1727204066.62404: done queuing things up, now waiting for results queue to drain 12180 1727204066.62406: waiting for pending results... 
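The assert task traced above (evaluating `interface_stat.stat.exists` after a stat of the interface path) follows a pattern like the sketch below. This is a hypothetical reconstruction from the logged conditional and task names only, not the actual contents of `assert_device_present.yml`:

```yaml
# Sketch reconstructed from the trace; the real
# assert_device_present.yml in fedora.linux_system_roles may differ.
- name: Include the task 'get_interface_stat.yml'
  include_tasks: get_interface_stat.yml

- name: "Assert that the interface is present - '{{ interface }}'"
  assert:
    that:
      - interface_stat.stat.exists
```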
12180 1727204066.62738: running TaskExecutor() for managed-node1/TASK: Include the task 'get_interface_stat.yml' 12180 1727204066.62844: in run() - task 0affcd87-79f5-ccb1-55ae-00000000001b 12180 1727204066.62870: variable 'ansible_search_path' from source: unknown 12180 1727204066.62878: variable 'ansible_search_path' from source: unknown 12180 1727204066.62918: calling self._execute() 12180 1727204066.63012: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204066.63023: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204066.63037: variable 'omit' from source: magic vars 12180 1727204066.63489: variable 'ansible_distribution_major_version' from source: facts 12180 1727204066.63512: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204066.63525: _execute() done 12180 1727204066.63533: dumping result to json 12180 1727204066.63540: done dumping result, returning 12180 1727204066.63551: done running TaskExecutor() for managed-node1/TASK: Include the task 'get_interface_stat.yml' [0affcd87-79f5-ccb1-55ae-00000000001b] 12180 1727204066.63561: sending task result for task 0affcd87-79f5-ccb1-55ae-00000000001b 12180 1727204066.63699: no more pending results, returning what we have 12180 1727204066.63705: in VariableManager get_vars() 12180 1727204066.63757: Calling all_inventory to load vars for managed-node1 12180 1727204066.63761: Calling groups_inventory to load vars for managed-node1 12180 1727204066.63765: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204066.63781: Calling all_plugins_play to load vars for managed-node1 12180 1727204066.63784: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204066.63787: Calling groups_plugins_play to load vars for managed-node1 12180 1727204066.64045: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204066.64247: done 
with get_vars() 12180 1727204066.64255: variable 'ansible_search_path' from source: unknown 12180 1727204066.64256: variable 'ansible_search_path' from source: unknown 12180 1727204066.64298: we have included files to process 12180 1727204066.64300: generating all_blocks data 12180 1727204066.64302: done generating all_blocks data 12180 1727204066.64307: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 12180 1727204066.64309: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 12180 1727204066.64311: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 12180 1727204066.64651: done sending task result for task 0affcd87-79f5-ccb1-55ae-00000000001b 12180 1727204066.64655: WORKER PROCESS EXITING 12180 1727204066.64768: done processing included file 12180 1727204066.64771: iterating over new_blocks loaded from include file 12180 1727204066.64773: in VariableManager get_vars() 12180 1727204066.64794: done with get_vars() 12180 1727204066.64796: filtering new block on tags 12180 1727204066.64814: done filtering new block on tags 12180 1727204066.64816: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node1 12180 1727204066.64821: extending task lists for all hosts with included blocks 12180 1727204066.65012: done extending task lists 12180 1727204066.65014: done processing included files 12180 1727204066.65015: results queue empty 12180 1727204066.65015: checking for any_errors_fatal 12180 1727204066.65019: done checking for any_errors_fatal 12180 1727204066.65019: checking for max_fail_percentage 12180 1727204066.65020: done checking for max_fail_percentage 
12180 1727204066.65021: checking to see if all hosts have failed and the running result is not ok 12180 1727204066.65022: done checking to see if all hosts have failed 12180 1727204066.65023: getting the remaining hosts for this loop 12180 1727204066.65024: done getting the remaining hosts for this loop 12180 1727204066.65026: getting the next task for host managed-node1 12180 1727204066.65030: done getting next task for host managed-node1 12180 1727204066.65032: ^ task is: TASK: Get stat for interface {{ interface }} 12180 1727204066.65035: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12180 1727204066.65037: getting variables 12180 1727204066.65038: in VariableManager get_vars() 12180 1727204066.65052: Calling all_inventory to load vars for managed-node1 12180 1727204066.65054: Calling groups_inventory to load vars for managed-node1 12180 1727204066.65056: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204066.65062: Calling all_plugins_play to load vars for managed-node1 12180 1727204066.65066: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204066.65069: Calling groups_plugins_play to load vars for managed-node1 12180 1727204066.65211: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204066.65409: done with get_vars() 12180 1727204066.65418: done getting variables 12180 1727204066.65889: variable 'interface' from source: task vars 12180 1727204066.65893: variable 'dhcp_interface2' from source: play vars 12180 1727204066.65953: variable 'dhcp_interface2' from source: play vars TASK [Get stat for interface test2] ******************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 14:54:26 -0400 (0:00:00.041) 0:00:14.071 ***** 12180 1727204066.65987: entering _queue_task() for managed-node1/stat 12180 1727204066.66341: worker is 1 (out of 1 available) 12180 1727204066.66352: exiting _queue_task() for managed-node1/stat 12180 1727204066.66367: done queuing things up, now waiting for results queue to drain 12180 1727204066.66369: waiting for pending results... 
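The stat task that the log is about to run for `test2` (and already ran for `test1`) can be sketched from the `module_args` recorded above (`path: /sys/class/net/<interface>`, `follow: false`, with attributes, checksum, and mime disabled). This is an inferred sketch, not the verbatim `get_interface_stat.yml`; the `register` keyword is an assumption based on `interface_stat` appearing as a set fact later in the trace:

```yaml
# Inferred from the logged module_args for AnsiballZ_stat.py;
# the real get_interface_stat.yml may differ in detail.
- name: Get stat for interface {{ interface }}
  stat:
    path: "/sys/class/net/{{ interface }}"
    follow: false
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: interface_stat
```

Because `/sys/class/net/<name>` is a symlink into `/sys/devices/...`, the result reports `islnk: true` with `lnk_source`/`lnk_target`, as seen in the `test1` output above.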
12180 1727204066.66669: running TaskExecutor() for managed-node1/TASK: Get stat for interface test2 12180 1727204066.66795: in run() - task 0affcd87-79f5-ccb1-55ae-00000000016b 12180 1727204066.66819: variable 'ansible_search_path' from source: unknown 12180 1727204066.66827: variable 'ansible_search_path' from source: unknown 12180 1727204066.66868: calling self._execute() 12180 1727204066.66952: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204066.66966: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204066.66982: variable 'omit' from source: magic vars 12180 1727204066.67409: variable 'ansible_distribution_major_version' from source: facts 12180 1727204066.67426: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204066.67436: variable 'omit' from source: magic vars 12180 1727204066.67499: variable 'omit' from source: magic vars 12180 1727204066.67670: variable 'interface' from source: task vars 12180 1727204066.67681: variable 'dhcp_interface2' from source: play vars 12180 1727204066.67751: variable 'dhcp_interface2' from source: play vars 12180 1727204066.67776: variable 'omit' from source: magic vars 12180 1727204066.67826: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12180 1727204066.67868: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12180 1727204066.67893: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12180 1727204066.67913: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204066.67933: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204066.67968: variable 'inventory_hostname' from source: host vars for 
'managed-node1' 12180 1727204066.67977: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204066.67984: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204066.68089: Set connection var ansible_pipelining to False 12180 1727204066.68097: Set connection var ansible_shell_type to sh 12180 1727204066.68107: Set connection var ansible_timeout to 10 12180 1727204066.68115: Set connection var ansible_connection to ssh 12180 1727204066.68123: Set connection var ansible_shell_executable to /bin/sh 12180 1727204066.68132: Set connection var ansible_module_compression to ZIP_DEFLATED 12180 1727204066.68170: variable 'ansible_shell_executable' from source: unknown 12180 1727204066.68178: variable 'ansible_connection' from source: unknown 12180 1727204066.68183: variable 'ansible_module_compression' from source: unknown 12180 1727204066.68188: variable 'ansible_shell_type' from source: unknown 12180 1727204066.68197: variable 'ansible_shell_executable' from source: unknown 12180 1727204066.68203: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204066.68210: variable 'ansible_pipelining' from source: unknown 12180 1727204066.68216: variable 'ansible_timeout' from source: unknown 12180 1727204066.68223: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204066.68706: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 12180 1727204066.68720: variable 'omit' from source: magic vars 12180 1727204066.68730: starting attempt loop 12180 1727204066.68736: running the handler 12180 1727204066.68752: _low_level_execute_command(): starting 12180 1727204066.68766: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12180 
1727204066.69525: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204066.69590: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204066.69607: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204066.69627: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204066.69679: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204066.69691: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204066.69704: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204066.69721: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204066.69732: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204066.69743: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204066.69755: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204066.69776: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204066.69793: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204066.69806: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204066.69818: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204066.69832: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204066.69912: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204066.69937: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204066.69954: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204066.70049: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204066.71624: stdout chunk (state=3): >>>/root <<< 12180 1727204066.71783: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204066.71824: stderr chunk (state=3): >>><<< 12180 1727204066.71827: stdout chunk (state=3): >>><<< 12180 1727204066.71942: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204066.71946: _low_level_execute_command(): starting 12180 1727204066.71949: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204066.7184727-13895-122584660000837 `" && echo 
ansible-tmp-1727204066.7184727-13895-122584660000837="` echo /root/.ansible/tmp/ansible-tmp-1727204066.7184727-13895-122584660000837 `" ) && sleep 0' 12180 1727204066.72720: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204066.72735: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204066.72753: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204066.72776: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204066.72824: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204066.72837: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204066.72852: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204066.72874: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204066.72888: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204066.72900: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204066.72917: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204066.72932: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204066.72949: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204066.72961: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204066.72976: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204066.72990: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204066.73070: stderr chunk (state=3): >>>debug1: 
auto-mux: Trying existing master <<< 12180 1727204066.73095: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204066.73113: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204066.73204: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204066.75042: stdout chunk (state=3): >>>ansible-tmp-1727204066.7184727-13895-122584660000837=/root/.ansible/tmp/ansible-tmp-1727204066.7184727-13895-122584660000837 <<< 12180 1727204066.75184: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204066.75258: stderr chunk (state=3): >>><<< 12180 1727204066.75261: stdout chunk (state=3): >>><<< 12180 1727204066.75472: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204066.7184727-13895-122584660000837=/root/.ansible/tmp/ansible-tmp-1727204066.7184727-13895-122584660000837 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204066.75476: variable 'ansible_module_compression' from source: unknown 12180 1727204066.75479: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12180cbnqllfr/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 12180 1727204066.75481: variable 'ansible_facts' from source: unknown 12180 1727204066.75534: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204066.7184727-13895-122584660000837/AnsiballZ_stat.py 12180 1727204066.76280: Sending initial data 12180 1727204066.76290: Sent initial data (153 bytes) 12180 1727204066.77347: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204066.77366: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204066.77382: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204066.77400: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204066.77443: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204066.77459: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204066.77480: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204066.77499: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204066.77511: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204066.77554: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204066.77570: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204066.77589: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204066.77604: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204066.77617: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204066.77628: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204066.77641: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204066.77721: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204066.77744: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204066.77759: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204066.77848: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204066.79569: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12180 1727204066.79613: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12180 1727204066.79668: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12180cbnqllfr/tmpyyxqy629 /root/.ansible/tmp/ansible-tmp-1727204066.7184727-13895-122584660000837/AnsiballZ_stat.py <<< 12180 1727204066.79715: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12180 1727204066.81174: 
stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204066.81178: stderr chunk (state=3): >>><<< 12180 1727204066.81186: stdout chunk (state=3): >>><<< 12180 1727204066.81206: done transferring module to remote 12180 1727204066.81218: _low_level_execute_command(): starting 12180 1727204066.81226: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204066.7184727-13895-122584660000837/ /root/.ansible/tmp/ansible-tmp-1727204066.7184727-13895-122584660000837/AnsiballZ_stat.py && sleep 0' 12180 1727204066.82834: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204066.82893: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204066.82897: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204066.82933: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 12180 1727204066.82937: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204066.82940: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found <<< 12180 1727204066.82943: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204066.83124: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204066.83128: stderr chunk 
(state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204066.83130: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204066.83224: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204066.84987: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204066.85036: stderr chunk (state=3): >>><<< 12180 1727204066.85039: stdout chunk (state=3): >>><<< 12180 1727204066.85070: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204066.85073: _low_level_execute_command(): starting 12180 1727204066.85075: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204066.7184727-13895-122584660000837/AnsiballZ_stat.py && sleep 0' 12180 1727204066.87097: stderr chunk (state=2): 
>>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204066.87101: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204066.87123: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 12180 1727204066.87239: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204066.87243: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204066.87252: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204066.87259: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204066.87268: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204066.87278: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204066.87349: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204066.87360: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204066.87369: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204066.87380: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204066.87566: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204066.87590: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204066.87603: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204066.87701: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204067.00785: 
stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test2", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 25706, "dev": 21, "nlink": 1, "atime": 1727204064.5693028, "mtime": 1727204064.5693028, "ctime": 1727204064.5693028, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test2", "follow": false, "checksum_algorithm": "sha1"}}} <<< 12180 1727204067.01697: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
<<< 12180 1727204067.01785: stderr chunk (state=3): >>><<< 12180 1727204067.01789: stdout chunk (state=3): >>><<< 12180 1727204067.01817: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test2", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 25706, "dev": 21, "nlink": 1, "atime": 1727204064.5693028, "mtime": 1727204064.5693028, "ctime": 1727204064.5693028, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test2", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 12180 1727204067.01879: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/test2', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204066.7184727-13895-122584660000837/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12180 1727204067.01888: _low_level_execute_command(): starting 12180 1727204067.01894: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204066.7184727-13895-122584660000837/ > /dev/null 2>&1 && sleep 0' 12180 1727204067.03442: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204067.03480: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204067.03496: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204067.03516: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204067.04347: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204067.04435: stderr chunk 
(state=3): >>>debug2: match not found <<< 12180 1727204067.04450: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204067.04470: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204067.04483: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204067.04495: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204067.04507: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204067.04520: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204067.04544: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204067.04557: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204067.04572: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204067.04585: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204067.04779: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204067.04878: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204067.04894: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204067.04982: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204067.06850: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204067.06854: stdout chunk (state=3): >>><<< 12180 1727204067.06857: stderr chunk (state=3): >>><<< 12180 1727204067.07175: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204067.07178: handler run complete 12180 1727204067.07181: attempt loop complete, returning result 12180 1727204067.07183: _execute() done 12180 1727204067.07185: dumping result to json 12180 1727204067.07187: done dumping result, returning 12180 1727204067.07189: done running TaskExecutor() for managed-node1/TASK: Get stat for interface test2 [0affcd87-79f5-ccb1-55ae-00000000016b] 12180 1727204067.07191: sending task result for task 0affcd87-79f5-ccb1-55ae-00000000016b 12180 1727204067.07267: done sending task result for task 0affcd87-79f5-ccb1-55ae-00000000016b 12180 1727204067.07271: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "stat": { "atime": 1727204064.5693028, "block_size": 4096, "blocks": 0, "ctime": 1727204064.5693028, "dev": 21, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 25706, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, 
"issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "mode": "0777", "mtime": 1727204064.5693028, "nlink": 1, "path": "/sys/class/net/test2", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 12180 1727204067.07375: no more pending results, returning what we have 12180 1727204067.07379: results queue empty 12180 1727204067.07380: checking for any_errors_fatal 12180 1727204067.07381: done checking for any_errors_fatal 12180 1727204067.07382: checking for max_fail_percentage 12180 1727204067.07384: done checking for max_fail_percentage 12180 1727204067.07385: checking to see if all hosts have failed and the running result is not ok 12180 1727204067.07386: done checking to see if all hosts have failed 12180 1727204067.07386: getting the remaining hosts for this loop 12180 1727204067.07388: done getting the remaining hosts for this loop 12180 1727204067.07391: getting the next task for host managed-node1 12180 1727204067.07399: done getting next task for host managed-node1 12180 1727204067.07401: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 12180 1727204067.07404: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12180 1727204067.07408: getting variables 12180 1727204067.07409: in VariableManager get_vars() 12180 1727204067.07451: Calling all_inventory to load vars for managed-node1 12180 1727204067.07459: Calling groups_inventory to load vars for managed-node1 12180 1727204067.07462: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204067.07480: Calling all_plugins_play to load vars for managed-node1 12180 1727204067.07483: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204067.07487: Calling groups_plugins_play to load vars for managed-node1 12180 1727204067.07802: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204067.08010: done with get_vars() 12180 1727204067.08020: done getting variables 12180 1727204067.08081: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 12180 1727204067.08312: variable 'interface' from source: task vars 12180 1727204067.08429: variable 'dhcp_interface2' from source: play vars 12180 1727204067.08495: variable 'dhcp_interface2' from source: play vars TASK [Assert that the interface is present - 'test2'] ************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Tuesday 24 September 2024 14:54:27 -0400 (0:00:00.425) 0:00:14.497 ***** 12180 1727204067.08527: entering _queue_task() for managed-node1/assert 12180 1727204067.09126: worker is 1 (out of 1 available) 12180 1727204067.09137: exiting _queue_task() for managed-node1/assert 12180 1727204067.09150: done queuing things up, now waiting for results queue to drain 12180 1727204067.09151: waiting for pending results... 
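The records above trace Ansible's low-level execution lifecycle for the stat task: create a per-task temp directory on the target (`umask 77 && mkdir -p ...`), sftp the AnsiballZ module into it, `chmod u+x`, run it with the remote Python, then `rm -rf` the directory. As a minimal local sketch of that sequence, here is a hypothetical helper mimicking only the temp-dir step; the `make_remote_tmpdir` name, the `base` parameter, and the random-suffix width are illustrative assumptions, not Ansible's actual implementation.

```python
import os
import random
import subprocess
import time


def make_remote_tmpdir(base):
    """Create a per-task temp dir named like the log's
    ansible-tmp-<epoch>-<pid>-<random> directories (sketch only)."""
    name = "ansible-tmp-%s-%s-%s" % (time.time(), os.getpid(),
                                     random.randint(0, 2 ** 48))
    path = os.path.join(base, name)
    # Mirror the shell fragment visible in the log:
    # ( umask 77 && mkdir -p "<path>" )
    subprocess.run(["sh", "-c", 'umask 77 && mkdir -p "%s"' % path],
                   check=True)
    return path
```

In the real run this command is sent over the multiplexed SSH connection (`mux_client_request_session` in the stderr chunks), which is why each round trip only costs a few milliseconds.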
12180 1727204067.09982: running TaskExecutor() for managed-node1/TASK: Assert that the interface is present - 'test2' 12180 1727204067.10215: in run() - task 0affcd87-79f5-ccb1-55ae-00000000001c 12180 1727204067.10233: variable 'ansible_search_path' from source: unknown 12180 1727204067.10240: variable 'ansible_search_path' from source: unknown 12180 1727204067.10308: calling self._execute() 12180 1727204067.10468: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204067.10609: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204067.10623: variable 'omit' from source: magic vars 12180 1727204067.11321: variable 'ansible_distribution_major_version' from source: facts 12180 1727204067.11340: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204067.11357: variable 'omit' from source: magic vars 12180 1727204067.11408: variable 'omit' from source: magic vars 12180 1727204067.11554: variable 'interface' from source: task vars 12180 1727204067.11690: variable 'dhcp_interface2' from source: play vars 12180 1727204067.11759: variable 'dhcp_interface2' from source: play vars 12180 1727204067.11920: variable 'omit' from source: magic vars 12180 1727204067.11966: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12180 1727204067.12124: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12180 1727204067.12149: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12180 1727204067.12173: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204067.12188: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204067.12227: variable 'inventory_hostname' from source: host 
vars for 'managed-node1' 12180 1727204067.12237: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204067.12341: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204067.12557: Set connection var ansible_pipelining to False 12180 1727204067.12568: Set connection var ansible_shell_type to sh 12180 1727204067.12580: Set connection var ansible_timeout to 10 12180 1727204067.12589: Set connection var ansible_connection to ssh 12180 1727204067.12599: Set connection var ansible_shell_executable to /bin/sh 12180 1727204067.12608: Set connection var ansible_module_compression to ZIP_DEFLATED 12180 1727204067.12638: variable 'ansible_shell_executable' from source: unknown 12180 1727204067.12647: variable 'ansible_connection' from source: unknown 12180 1727204067.12655: variable 'ansible_module_compression' from source: unknown 12180 1727204067.12668: variable 'ansible_shell_type' from source: unknown 12180 1727204067.12675: variable 'ansible_shell_executable' from source: unknown 12180 1727204067.12681: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204067.12688: variable 'ansible_pipelining' from source: unknown 12180 1727204067.12694: variable 'ansible_timeout' from source: unknown 12180 1727204067.12701: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204067.13023: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12180 1727204067.13040: variable 'omit' from source: magic vars 12180 1727204067.13050: starting attempt loop 12180 1727204067.13057: running the handler 12180 1727204067.13429: variable 'interface_stat' from source: set_fact 12180 1727204067.13454: Evaluated conditional 
(interface_stat.stat.exists): True 12180 1727204067.13467: handler run complete 12180 1727204067.13487: attempt loop complete, returning result 12180 1727204067.13493: _execute() done 12180 1727204067.13499: dumping result to json 12180 1727204067.13507: done dumping result, returning 12180 1727204067.13517: done running TaskExecutor() for managed-node1/TASK: Assert that the interface is present - 'test2' [0affcd87-79f5-ccb1-55ae-00000000001c] 12180 1727204067.13526: sending task result for task 0affcd87-79f5-ccb1-55ae-00000000001c
ok: [managed-node1] => {
    "changed": false
}

MSG:

All assertions passed

12180 1727204067.13690: no more pending results, returning what we have 12180 1727204067.13695: results queue empty 12180 1727204067.13696: checking for any_errors_fatal 12180 1727204067.13705: done checking for any_errors_fatal 12180 1727204067.13706: checking for max_fail_percentage 12180 1727204067.13708: done checking for max_fail_percentage 12180 1727204067.13709: checking to see if all hosts have failed and the running result is not ok 12180 1727204067.13710: done checking to see if all hosts have failed 12180 1727204067.13711: getting the remaining hosts for this loop 12180 1727204067.13713: done getting the remaining hosts for this loop 12180 1727204067.13716: getting the next task for host managed-node1 12180 1727204067.13724: done getting next task for host managed-node1 12180 1727204067.13727: ^ task is: TASK: Backup the /etc/resolv.conf for initscript 12180 1727204067.13729: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False 12180 1727204067.13732: getting variables 12180 1727204067.13734: in VariableManager get_vars() 12180 1727204067.13780: Calling all_inventory to load vars for managed-node1 12180 1727204067.13783: Calling groups_inventory to load vars for managed-node1 12180 1727204067.13785: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204067.13797: Calling all_plugins_play to load vars for managed-node1 12180 1727204067.13800: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204067.13803: Calling groups_plugins_play to load vars for managed-node1 12180 1727204067.13985: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204067.14197: done with get_vars() 12180 1727204067.14209: done getting variables 12180 1727204067.14268: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Backup the /etc/resolv.conf for initscript] ******************************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml:28
Tuesday 24 September 2024  14:54:27 -0400 (0:00:00.059)       0:00:14.557 *****

12180 1727204067.14529: entering _queue_task() for managed-node1/command 12180 1727204067.14543: done sending task result for task 0affcd87-79f5-ccb1-55ae-00000000001c 12180 1727204067.14546: WORKER PROCESS EXITING 12180 1727204067.15122: worker is 1 (out of 1 available) 12180 1727204067.15135: exiting _queue_task() for managed-node1/command 12180 1727204067.15146: done queuing things up, now waiting for results queue to drain 12180 1727204067.15147: waiting for pending results...
12180 1727204067.16066: running TaskExecutor() for managed-node1/TASK: Backup the /etc/resolv.conf for initscript 12180 1727204067.16275: in run() - task 0affcd87-79f5-ccb1-55ae-00000000001d 12180 1727204067.16295: variable 'ansible_search_path' from source: unknown 12180 1727204067.16340: calling self._execute() 12180 1727204067.16550: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204067.16561: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204067.16578: variable 'omit' from source: magic vars 12180 1727204067.17589: variable 'ansible_distribution_major_version' from source: facts 12180 1727204067.17608: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204067.17738: variable 'network_provider' from source: set_fact 12180 1727204067.17902: Evaluated conditional (network_provider == "initscripts"): False 12180 1727204067.17910: when evaluation is False, skipping this task 12180 1727204067.17918: _execute() done 12180 1727204067.17925: dumping result to json 12180 1727204067.17932: done dumping result, returning 12180 1727204067.17943: done running TaskExecutor() for managed-node1/TASK: Backup the /etc/resolv.conf for initscript [0affcd87-79f5-ccb1-55ae-00000000001d] 12180 1727204067.17953: sending task result for task 0affcd87-79f5-ccb1-55ae-00000000001d
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "network_provider == \"initscripts\"",
    "skip_reason": "Conditional result was False"
}

12180 1727204067.18109: no more pending results, returning what we have 12180 1727204067.18113: results queue empty 12180 1727204067.18115: checking for any_errors_fatal 12180 1727204067.18121: done checking for any_errors_fatal 12180 1727204067.18121: checking for max_fail_percentage 12180 1727204067.18123: done checking for max_fail_percentage 12180 1727204067.18124: checking to see if all hosts have failed and the running result is not ok 12180
1727204067.18125: done checking to see if all hosts have failed 12180 1727204067.18126: getting the remaining hosts for this loop 12180 1727204067.18127: done getting the remaining hosts for this loop 12180 1727204067.18131: getting the next task for host managed-node1 12180 1727204067.18138: done getting next task for host managed-node1 12180 1727204067.18141: ^ task is: TASK: TEST Add Bond with 2 ports using deprecated 'master' argument 12180 1727204067.18143: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12180 1727204067.18145: getting variables 12180 1727204067.18148: in VariableManager get_vars() 12180 1727204067.18193: Calling all_inventory to load vars for managed-node1 12180 1727204067.18197: Calling groups_inventory to load vars for managed-node1 12180 1727204067.18199: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204067.18212: Calling all_plugins_play to load vars for managed-node1 12180 1727204067.18215: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204067.18219: Calling groups_plugins_play to load vars for managed-node1 12180 1727204067.18465: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204067.18671: done with get_vars() 12180 1727204067.18682: done getting variables 12180 1727204067.18948: done sending task result for task 0affcd87-79f5-ccb1-55ae-00000000001d 12180 1727204067.18951: WORKER PROCESS EXITING 12180 1727204067.18996: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [TEST Add Bond with 2 ports using deprecated 'master' argument] ***********
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml:33
Tuesday 24 September 2024  14:54:27 -0400 (0:00:00.046)       0:00:14.603 *****

12180 1727204067.19142: entering _queue_task() for managed-node1/debug 12180 1727204067.19650: worker is 1 (out of 1 available) 12180 1727204067.19661: exiting _queue_task() for managed-node1/debug 12180 1727204067.19897: done queuing things up, now waiting for results queue to drain 12180 1727204067.19900: waiting for pending results... 12180 1727204067.21128: running TaskExecutor() for managed-node1/TASK: TEST Add Bond with 2 ports using deprecated 'master' argument 12180 1727204067.21282: in run() - task 0affcd87-79f5-ccb1-55ae-00000000001e 12180 1727204067.21532: variable 'ansible_search_path' from source: unknown 12180 1727204067.21575: calling self._execute() 12180 1727204067.21889: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204067.21900: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204067.21915: variable 'omit' from source: magic vars 12180 1727204067.23047: variable 'ansible_distribution_major_version' from source: facts 12180 1727204067.23068: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204067.23161: variable 'omit' from source: magic vars 12180 1727204067.23188: variable 'omit' from source: magic vars 12180 1727204067.23227: variable 'omit' from source: magic vars 12180 1727204067.23305: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12180 1727204067.23521: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py
(found_in_cache=True, class_only=False) 12180 1727204067.23617: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12180 1727204067.23640: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204067.23715: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204067.23844: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12180 1727204067.23926: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204067.23936: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204067.24272: Set connection var ansible_pipelining to False 12180 1727204067.24281: Set connection var ansible_shell_type to sh 12180 1727204067.24292: Set connection var ansible_timeout to 10 12180 1727204067.24302: Set connection var ansible_connection to ssh 12180 1727204067.24310: Set connection var ansible_shell_executable to /bin/sh 12180 1727204067.24319: Set connection var ansible_module_compression to ZIP_DEFLATED 12180 1727204067.24386: variable 'ansible_shell_executable' from source: unknown 12180 1727204067.24579: variable 'ansible_connection' from source: unknown 12180 1727204067.24588: variable 'ansible_module_compression' from source: unknown 12180 1727204067.24595: variable 'ansible_shell_type' from source: unknown 12180 1727204067.24602: variable 'ansible_shell_executable' from source: unknown 12180 1727204067.24608: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204067.24617: variable 'ansible_pipelining' from source: unknown 12180 1727204067.24623: variable 'ansible_timeout' from source: unknown 12180 1727204067.24630: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204067.24893: Loading ActionModule 'debug' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12180 1727204067.25133: variable 'omit' from source: magic vars 12180 1727204067.25145: starting attempt loop 12180 1727204067.25153: running the handler 12180 1727204067.25205: handler run complete 12180 1727204067.25356: attempt loop complete, returning result 12180 1727204067.25452: _execute() done 12180 1727204067.25459: dumping result to json 12180 1727204067.25470: done dumping result, returning 12180 1727204067.25482: done running TaskExecutor() for managed-node1/TASK: TEST Add Bond with 2 ports using deprecated 'master' argument [0affcd87-79f5-ccb1-55ae-00000000001e] 12180 1727204067.25493: sending task result for task 0affcd87-79f5-ccb1-55ae-00000000001e
ok: [managed-node1] => {}

MSG:

##################################################

12180 1727204067.25642: no more pending results, returning what we have 12180 1727204067.25646: results queue empty 12180 1727204067.25647: checking for any_errors_fatal 12180 1727204067.25652: done checking for any_errors_fatal 12180 1727204067.25653: checking for max_fail_percentage 12180 1727204067.25655: done checking for max_fail_percentage 12180 1727204067.25656: checking to see if all hosts have failed and the running result is not ok 12180 1727204067.25657: done checking to see if all hosts have failed 12180 1727204067.25658: getting the remaining hosts for this loop 12180 1727204067.25660: done getting the remaining hosts for this loop 12180 1727204067.25665: getting the next task for host managed-node1 12180 1727204067.25674: done getting next task for host managed-node1 12180 1727204067.25680: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 12180 1727204067.25684: ^ state is: HOST STATE: block=2, task=7, rescue=0,
always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12180 1727204067.25702: getting variables 12180 1727204067.25704: in VariableManager get_vars() 12180 1727204067.25750: Calling all_inventory to load vars for managed-node1 12180 1727204067.25754: Calling groups_inventory to load vars for managed-node1 12180 1727204067.25757: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204067.25769: Calling all_plugins_play to load vars for managed-node1 12180 1727204067.25772: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204067.25775: Calling groups_plugins_play to load vars for managed-node1 12180 1727204067.25951: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204067.26166: done with get_vars() 12180 1727204067.26178: done getting variables 12180 1727204067.27172: done sending task result for task 0affcd87-79f5-ccb1-55ae-00000000001e 12180 1727204067.27175: WORKER PROCESS EXITING

TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] ***
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4
Tuesday 24 September 2024  14:54:27 -0400 (0:00:00.080)       0:00:14.683 *****

12180 1727204067.27200: entering _queue_task() for managed-node1/include_tasks 12180 1727204067.27644: worker is 1 (out of 1 available) 12180 1727204067.27657: exiting _queue_task() for managed-node1/include_tasks 12180 1727204067.27672: done queuing
things up, now waiting for results queue to drain 12180 1727204067.27674: waiting for pending results... 12180 1727204067.28939: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 12180 1727204067.29286: in run() - task 0affcd87-79f5-ccb1-55ae-000000000026 12180 1727204067.29304: variable 'ansible_search_path' from source: unknown 12180 1727204067.29344: variable 'ansible_search_path' from source: unknown 12180 1727204067.29388: calling self._execute() 12180 1727204067.29630: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204067.29779: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204067.29793: variable 'omit' from source: magic vars 12180 1727204067.30829: variable 'ansible_distribution_major_version' from source: facts 12180 1727204067.30984: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204067.30994: _execute() done 12180 1727204067.31002: dumping result to json 12180 1727204067.31009: done dumping result, returning 12180 1727204067.31020: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcd87-79f5-ccb1-55ae-000000000026] 12180 1727204067.31030: sending task result for task 0affcd87-79f5-ccb1-55ae-000000000026 12180 1727204067.31169: no more pending results, returning what we have 12180 1727204067.31174: in VariableManager get_vars() 12180 1727204067.31227: Calling all_inventory to load vars for managed-node1 12180 1727204067.31231: Calling groups_inventory to load vars for managed-node1 12180 1727204067.31234: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204067.31247: Calling all_plugins_play to load vars for managed-node1 12180 1727204067.31250: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204067.31253: Calling groups_plugins_play to load vars for managed-node1 
12180 1727204067.31477: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204067.32075: done with get_vars() 12180 1727204067.32083: variable 'ansible_search_path' from source: unknown 12180 1727204067.32084: variable 'ansible_search_path' from source: unknown 12180 1727204067.32128: we have included files to process 12180 1727204067.32129: generating all_blocks data 12180 1727204067.32131: done generating all_blocks data 12180 1727204067.32136: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 12180 1727204067.32137: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 12180 1727204067.32139: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 12180 1727204067.32984: done sending task result for task 0affcd87-79f5-ccb1-55ae-000000000026 12180 1727204067.32988: WORKER PROCESS EXITING 12180 1727204067.33402: done processing included file 12180 1727204067.33404: iterating over new_blocks loaded from include file 12180 1727204067.33406: in VariableManager get_vars() 12180 1727204067.33430: done with get_vars() 12180 1727204067.33432: filtering new block on tags 12180 1727204067.33449: done filtering new block on tags 12180 1727204067.33452: in VariableManager get_vars() 12180 1727204067.33476: done with get_vars() 12180 1727204067.33478: filtering new block on tags 12180 1727204067.33499: done filtering new block on tags 12180 1727204067.33501: in VariableManager get_vars() 12180 1727204067.33527: done with get_vars() 12180 1727204067.33528: filtering new block on tags 12180 1727204067.33548: done filtering new block on tags 12180 1727204067.33550: done iterating over new_blocks loaded from include file included: 
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node1 12180 1727204067.33555: extending task lists for all hosts with included blocks 12180 1727204067.35856: done extending task lists 12180 1727204067.35858: done processing included files 12180 1727204067.35859: results queue empty 12180 1727204067.35859: checking for any_errors_fatal 12180 1727204067.35863: done checking for any_errors_fatal 12180 1727204067.35865: checking for max_fail_percentage 12180 1727204067.35866: done checking for max_fail_percentage 12180 1727204067.35867: checking to see if all hosts have failed and the running result is not ok 12180 1727204067.35868: done checking to see if all hosts have failed 12180 1727204067.35869: getting the remaining hosts for this loop 12180 1727204067.35870: done getting the remaining hosts for this loop 12180 1727204067.35873: getting the next task for host managed-node1 12180 1727204067.35878: done getting next task for host managed-node1 12180 1727204067.35881: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 12180 1727204067.35883: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 12180 1727204067.35893: getting variables 12180 1727204067.35895: in VariableManager get_vars() 12180 1727204067.35914: Calling all_inventory to load vars for managed-node1 12180 1727204067.35917: Calling groups_inventory to load vars for managed-node1 12180 1727204067.35919: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204067.35925: Calling all_plugins_play to load vars for managed-node1 12180 1727204067.35927: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204067.35930: Calling groups_plugins_play to load vars for managed-node1 12180 1727204067.36077: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204067.37280: done with get_vars() 12180 1727204067.37293: done getting variables

TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] ***
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3
Tuesday 24 September 2024  14:54:27 -0400 (0:00:00.101)       0:00:14.785 *****

12180 1727204067.37375: entering _queue_task() for managed-node1/setup 12180 1727204067.38075: worker is 1 (out of 1 available) 12180 1727204067.38087: exiting _queue_task() for managed-node1/setup 12180 1727204067.38098: done queuing things up, now waiting for results queue to drain 12180 1727204067.38099: waiting for pending results...
12180 1727204067.38948: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 12180 1727204067.39098: in run() - task 0affcd87-79f5-ccb1-55ae-000000000189 12180 1727204067.39240: variable 'ansible_search_path' from source: unknown 12180 1727204067.39249: variable 'ansible_search_path' from source: unknown 12180 1727204067.39293: calling self._execute() 12180 1727204067.39480: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204067.39557: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204067.39574: variable 'omit' from source: magic vars 12180 1727204067.40232: variable 'ansible_distribution_major_version' from source: facts 12180 1727204067.40324: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204067.40849: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12180 1727204067.47609: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12180 1727204067.47735: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12180 1727204067.48999: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12180 1727204067.49042: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12180 1727204067.49312: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12180 1727204067.49624: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12180 1727204067.49660: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12180 1727204067.49695: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12180 1727204067.49746: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12180 1727204067.49951: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12180 1727204067.50012: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12180 1727204067.50178: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12180 1727204067.50289: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12180 1727204067.50335: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12180 1727204067.50489: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12180 1727204067.50968: variable '__network_required_facts' from source: role 
'' defaults 12180 1727204067.50983: variable 'ansible_facts' from source: unknown 12180 1727204067.51296: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 12180 1727204067.51304: when evaluation is False, skipping this task 12180 1727204067.51311: _execute() done 12180 1727204067.51317: dumping result to json 12180 1727204067.51324: done dumping result, returning 12180 1727204067.51341: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcd87-79f5-ccb1-55ae-000000000189] 12180 1727204067.51455: sending task result for task 0affcd87-79f5-ccb1-55ae-000000000189
skipping: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}

12180 1727204067.51704: no more pending results, returning what we have 12180 1727204067.51709: results queue empty 12180 1727204067.51710: checking for any_errors_fatal 12180 1727204067.51711: done checking for any_errors_fatal 12180 1727204067.51712: checking for max_fail_percentage 12180 1727204067.51714: done checking for max_fail_percentage 12180 1727204067.51715: checking to see if all hosts have failed and the running result is not ok 12180 1727204067.51716: done checking to see if all hosts have failed 12180 1727204067.51717: getting the remaining hosts for this loop 12180 1727204067.51718: done getting the remaining hosts for this loop 12180 1727204067.51723: getting the next task for host managed-node1 12180 1727204067.51733: done getting next task for host managed-node1 12180 1727204067.51737: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 12180 1727204067.51741: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state?
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12180 1727204067.51755: getting variables 12180 1727204067.51757: in VariableManager get_vars() 12180 1727204067.51803: Calling all_inventory to load vars for managed-node1 12180 1727204067.51806: Calling groups_inventory to load vars for managed-node1 12180 1727204067.51809: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204067.51821: Calling all_plugins_play to load vars for managed-node1 12180 1727204067.51823: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204067.51826: Calling groups_plugins_play to load vars for managed-node1 12180 1727204067.52030: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204067.52240: done with get_vars() 12180 1727204067.52253: done getting variables 12180 1727204067.52568: done sending task result for task 0affcd87-79f5-ccb1-55ae-000000000189 12180 1727204067.52572: WORKER PROCESS EXITING

TASK [fedora.linux_system_roles.network : Check if system is ostree] ***********
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12
Tuesday 24 September 2024  14:54:27 -0400 (0:00:00.152)       0:00:14.938 *****

12180 1727204067.52653: entering _queue_task() for managed-node1/stat 12180 1727204067.52906: worker is
1 (out of 1 available) 12180 1727204067.52917: exiting _queue_task() for managed-node1/stat 12180 1727204067.52930: done queuing things up, now waiting for results queue to drain 12180 1727204067.52932: waiting for pending results... 12180 1727204067.53916: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 12180 1727204067.54727: in run() - task 0affcd87-79f5-ccb1-55ae-00000000018b 12180 1727204067.54809: variable 'ansible_search_path' from source: unknown 12180 1727204067.54906: variable 'ansible_search_path' from source: unknown 12180 1727204067.54953: calling self._execute() 12180 1727204067.55088: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204067.55344: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204067.55360: variable 'omit' from source: magic vars 12180 1727204067.56398: variable 'ansible_distribution_major_version' from source: facts 12180 1727204067.56550: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204067.57035: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12180 1727204067.57656: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12180 1727204067.57709: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12180 1727204067.57752: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12180 1727204067.57866: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12180 1727204067.58071: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12180 1727204067.58103: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12180 1727204067.58133: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12180 1727204067.58189: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12180 1727204067.58361: variable '__network_is_ostree' from source: set_fact 12180 1727204067.58492: Evaluated conditional (not __network_is_ostree is defined): False 12180 1727204067.58500: when evaluation is False, skipping this task 12180 1727204067.58507: _execute() done 12180 1727204067.58514: dumping result to json 12180 1727204067.58521: done dumping result, returning 12180 1727204067.58533: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcd87-79f5-ccb1-55ae-00000000018b] 12180 1727204067.58543: sending task result for task 0affcd87-79f5-ccb1-55ae-00000000018b skipping: [managed-node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 12180 1727204067.58692: no more pending results, returning what we have 12180 1727204067.58696: results queue empty 12180 1727204067.58697: checking for any_errors_fatal 12180 1727204067.58703: done checking for any_errors_fatal 12180 1727204067.58704: checking for max_fail_percentage 12180 1727204067.58706: done checking for max_fail_percentage 12180 1727204067.58707: checking to see if all hosts have failed and the running result is not ok 12180 1727204067.58708: done checking to see if all hosts have failed 12180 1727204067.58708: getting the remaining hosts for this loop 12180 
1727204067.58710: done getting the remaining hosts for this loop 12180 1727204067.58714: getting the next task for host managed-node1 12180 1727204067.58721: done getting next task for host managed-node1 12180 1727204067.58725: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 12180 1727204067.58730: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12180 1727204067.58744: getting variables 12180 1727204067.58745: in VariableManager get_vars() 12180 1727204067.58790: Calling all_inventory to load vars for managed-node1 12180 1727204067.58793: Calling groups_inventory to load vars for managed-node1 12180 1727204067.58795: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204067.58807: Calling all_plugins_play to load vars for managed-node1 12180 1727204067.58809: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204067.58812: Calling groups_plugins_play to load vars for managed-node1 12180 1727204067.58988: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204067.59192: done with get_vars() 12180 1727204067.59203: done getting variables 12180 1727204067.59261: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 12180 1727204067.60060: done sending task result for task 0affcd87-79f5-ccb1-55ae-00000000018b 12180 1727204067.60066: WORKER PROCESS EXITING TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 14:54:27 -0400 (0:00:00.074) 0:00:15.012 ***** 12180 1727204067.60080: entering _queue_task() for managed-node1/set_fact 12180 1727204067.60327: worker is 1 (out of 1 available) 12180 1727204067.60339: exiting _queue_task() for managed-node1/set_fact 12180 1727204067.60351: done queuing things up, now waiting for results queue to drain 12180 1727204067.60353: waiting for pending results... 
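The two "Evaluated conditional (not __network_is_ostree is defined): False ... skipping this task" entries above follow a simple pattern: each `when` clause is templated against the host's variables, and the first clause that evaluates false short-circuits the task into a skipped result. The following is a minimal sketch of that decision, not Ansible's real code; the tiny evaluator only understands the `not X is defined` form seen in this log, and real Ansible templates conditionals through Jinja2.

```python
# Sketch (NOT Ansible's implementation) of the skip decision printed in the
# log as: skipping: [managed-node1] => { "false_condition": ..., ... }

def evaluate_conditionals(conditionals, hostvars):
    """Return (all_true, first_false_condition)."""
    for cond in conditionals:
        # Real Ansible renders each conditional with Jinja2; this toy
        # evaluator only handles the "not <var> is defined" shape.
        if cond.startswith("not ") and cond.endswith(" is defined"):
            var = cond[len("not "):-len(" is defined")]
            result = var not in hostvars
        else:
            result = bool(hostvars.get(cond))
        if not result:
            return False, cond
    return True, None

def run_task(task, hostvars):
    ok, false_cond = evaluate_conditionals(task.get("when", []), hostvars)
    if not ok:
        # Shape mirrors the skipped-task JSON shown in the log above.
        return {
            "changed": False,
            "false_condition": false_cond,
            "skip_reason": "Conditional result was False",
        }
    return {"changed": True}

# __network_is_ostree was set by an earlier set_fact, so the task skips.
result = run_task(
    {"when": ["not __network_is_ostree is defined"]},
    {"__network_is_ostree": False},
)
print(result)
```

Because the fact is already defined (its value does not matter, only its presence), the conditional is false and the task is skipped rather than executed.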
12180 1727204067.61049: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 12180 1727204067.61308: in run() - task 0affcd87-79f5-ccb1-55ae-00000000018c 12180 1727204067.61358: variable 'ansible_search_path' from source: unknown 12180 1727204067.61369: variable 'ansible_search_path' from source: unknown 12180 1727204067.61408: calling self._execute() 12180 1727204067.61525: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204067.61680: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204067.61695: variable 'omit' from source: magic vars 12180 1727204067.62390: variable 'ansible_distribution_major_version' from source: facts 12180 1727204067.62412: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204067.62720: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12180 1727204067.63550: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12180 1727204067.63605: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12180 1727204067.63669: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12180 1727204067.63777: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12180 1727204067.63980: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12180 1727204067.64011: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12180 1727204067.64042: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12180 1727204067.64199: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12180 1727204067.64412: variable '__network_is_ostree' from source: set_fact 12180 1727204067.64426: Evaluated conditional (not __network_is_ostree is defined): False 12180 1727204067.64433: when evaluation is False, skipping this task 12180 1727204067.64441: _execute() done 12180 1727204067.64449: dumping result to json 12180 1727204067.64457: done dumping result, returning 12180 1727204067.64472: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcd87-79f5-ccb1-55ae-00000000018c] 12180 1727204067.64484: sending task result for task 0affcd87-79f5-ccb1-55ae-00000000018c 12180 1727204067.64596: done sending task result for task 0affcd87-79f5-ccb1-55ae-00000000018c skipping: [managed-node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 12180 1727204067.64649: no more pending results, returning what we have 12180 1727204067.64653: results queue empty 12180 1727204067.64654: checking for any_errors_fatal 12180 1727204067.64659: done checking for any_errors_fatal 12180 1727204067.64660: checking for max_fail_percentage 12180 1727204067.64662: done checking for max_fail_percentage 12180 1727204067.64665: checking to see if all hosts have failed and the running result is not ok 12180 1727204067.64666: done checking to see if all hosts have failed 12180 1727204067.64667: getting the remaining hosts for this loop 12180 1727204067.64669: done getting the remaining hosts for this loop 12180 1727204067.64673: getting the next task for 
host managed-node1 12180 1727204067.64684: done getting next task for host managed-node1 12180 1727204067.64689: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 12180 1727204067.64694: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12180 1727204067.64708: getting variables 12180 1727204067.64713: in VariableManager get_vars() 12180 1727204067.64759: Calling all_inventory to load vars for managed-node1 12180 1727204067.64762: Calling groups_inventory to load vars for managed-node1 12180 1727204067.64766: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204067.64778: Calling all_plugins_play to load vars for managed-node1 12180 1727204067.64780: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204067.64783: Calling groups_plugins_play to load vars for managed-node1 12180 1727204067.65000: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204067.65206: done with get_vars() 12180 1727204067.65218: done getting variables 12180 1727204067.66030: WORKER PROCESS EXITING TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 14:54:27 -0400 (0:00:00.060) 0:00:15.072 ***** 12180 1727204067.66096: entering _queue_task() for managed-node1/service_facts 12180 1727204067.66098: Creating lock for service_facts 12180 1727204067.66362: worker is 1 (out of 1 available) 12180 1727204067.66376: exiting _queue_task() for managed-node1/service_facts 12180 1727204067.66386: done queuing things up, now waiting for results queue to drain 12180 1727204067.66388: waiting for pending results... 
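The `_low_level_execute_command()` entries that follow show the standard two-step remote bootstrap: first `echo ~` resolves the remote home directory, then a `( umask 77 && mkdir ... )` compound command creates a uniquely named, permission-restricted temp directory to receive the AnsiballZ payload. Below is a local-filesystem sketch of that sequence under stated assumptions: the local machine stands in for the remote host, and the directory name format merely mimics the `ansible-tmp-<timestamp>-<pid>-<random>` pattern visible in the log.

```python
# Sketch (assumption: local filesystem stands in for the remote host) of the
# bootstrap sequence the log shows before a module payload is transferred.
import os
import random
import subprocess
import tempfile
import time

# Step 1: resolve the home directory, exactly as the log's
# "/bin/sh -c 'echo ~ && sleep 0'" command does remotely.
home = subprocess.run(
    ["/bin/sh", "-c", "echo ~ && sleep 0"],
    capture_output=True, text=True, check=True,
).stdout.strip()

# Step 2: create a private per-task temp directory. The name format mimics
# the log's ansible-tmp-1727204067.7456632-13981-228007359367986 pattern.
tmp_name = "ansible-tmp-%s-%s-%s" % (
    time.time(), os.getpid(), random.randint(0, 2**48)
)
base = os.path.join(tempfile.gettempdir(), ".ansible", "tmp")  # demo path
old_umask = os.umask(0o077)  # matches the `umask 77` in the logged command
try:
    task_tmp = os.path.join(base, tmp_name)
    os.makedirs(task_tmp)
finally:
    os.umask(old_umask)

print(task_tmp)
```

The restrictive umask means the payload directory is readable only by the connecting user, which is why the subsequent `sftp put` and `chmod u+x` steps in the log operate without any further permission changes.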
12180 1727204067.67140: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which services are running 12180 1727204067.67463: in run() - task 0affcd87-79f5-ccb1-55ae-00000000018e 12180 1727204067.67486: variable 'ansible_search_path' from source: unknown 12180 1727204067.67539: variable 'ansible_search_path' from source: unknown 12180 1727204067.67583: calling self._execute() 12180 1727204067.67720: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204067.67871: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204067.67886: variable 'omit' from source: magic vars 12180 1727204067.68462: variable 'ansible_distribution_major_version' from source: facts 12180 1727204067.68640: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204067.68652: variable 'omit' from source: magic vars 12180 1727204067.68728: variable 'omit' from source: magic vars 12180 1727204067.68772: variable 'omit' from source: magic vars 12180 1727204067.68886: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12180 1727204067.68991: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12180 1727204067.69015: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12180 1727204067.69082: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204067.69099: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204067.69133: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12180 1727204067.69178: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204067.69287: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed-node1' 12180 1727204067.69501: Set connection var ansible_pipelining to False 12180 1727204067.69510: Set connection var ansible_shell_type to sh 12180 1727204067.69521: Set connection var ansible_timeout to 10 12180 1727204067.69530: Set connection var ansible_connection to ssh 12180 1727204067.69541: Set connection var ansible_shell_executable to /bin/sh 12180 1727204067.69550: Set connection var ansible_module_compression to ZIP_DEFLATED 12180 1727204067.69584: variable 'ansible_shell_executable' from source: unknown 12180 1727204067.69592: variable 'ansible_connection' from source: unknown 12180 1727204067.69599: variable 'ansible_module_compression' from source: unknown 12180 1727204067.69610: variable 'ansible_shell_type' from source: unknown 12180 1727204067.69616: variable 'ansible_shell_executable' from source: unknown 12180 1727204067.69621: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204067.69628: variable 'ansible_pipelining' from source: unknown 12180 1727204067.69634: variable 'ansible_timeout' from source: unknown 12180 1727204067.69641: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204067.70057: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 12180 1727204067.70074: variable 'omit' from source: magic vars 12180 1727204067.70084: starting attempt loop 12180 1727204067.70091: running the handler 12180 1727204067.70107: _low_level_execute_command(): starting 12180 1727204067.70161: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12180 1727204067.71993: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204067.72130: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 12180 1727204067.72147: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204067.72169: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204067.72214: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204067.72230: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204067.72245: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204067.72267: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204067.72280: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204067.72291: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204067.72303: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204067.72316: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204067.72335: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204067.72347: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204067.72359: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204067.72377: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204067.72570: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204067.72596: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204067.72614: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204067.72707: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 
1727204067.74334: stdout chunk (state=3): >>>/root <<< 12180 1727204067.74535: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204067.74539: stdout chunk (state=3): >>><<< 12180 1727204067.74542: stderr chunk (state=3): >>><<< 12180 1727204067.74661: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204067.74666: _low_level_execute_command(): starting 12180 1727204067.74670: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204067.7456632-13981-228007359367986 `" && echo ansible-tmp-1727204067.7456632-13981-228007359367986="` echo /root/.ansible/tmp/ansible-tmp-1727204067.7456632-13981-228007359367986 `" ) && sleep 0' 12180 1727204067.76134: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 <<< 12180 1727204067.76150: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204067.76168: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204067.76192: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204067.76236: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204067.76298: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204067.76314: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204067.76331: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204067.76343: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204067.76353: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204067.76368: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204067.76382: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204067.76401: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204067.76520: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204067.76533: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204067.76548: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204067.76629: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204067.76652: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204067.76672: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 
<<< 12180 1727204067.76761: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204067.78620: stdout chunk (state=3): >>>ansible-tmp-1727204067.7456632-13981-228007359367986=/root/.ansible/tmp/ansible-tmp-1727204067.7456632-13981-228007359367986 <<< 12180 1727204067.78825: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204067.78829: stdout chunk (state=3): >>><<< 12180 1727204067.78832: stderr chunk (state=3): >>><<< 12180 1727204067.78871: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204067.7456632-13981-228007359367986=/root/.ansible/tmp/ansible-tmp-1727204067.7456632-13981-228007359367986 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204067.79171: variable 'ansible_module_compression' from source: unknown 12180 1727204067.79174: ANSIBALLZ: Using lock for service_facts 12180 
1727204067.79176: ANSIBALLZ: Acquiring lock 12180 1727204067.79178: ANSIBALLZ: Lock acquired: 140650305012512 12180 1727204067.79180: ANSIBALLZ: Creating module 12180 1727204068.06960: ANSIBALLZ: Writing module into payload 12180 1727204068.07106: ANSIBALLZ: Writing module 12180 1727204068.07139: ANSIBALLZ: Renaming module 12180 1727204068.07149: ANSIBALLZ: Done creating module 12180 1727204068.07172: variable 'ansible_facts' from source: unknown 12180 1727204068.07262: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204067.7456632-13981-228007359367986/AnsiballZ_service_facts.py 12180 1727204068.07435: Sending initial data 12180 1727204068.07438: Sent initial data (162 bytes) 12180 1727204068.08503: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204068.08521: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204068.08545: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204068.08568: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204068.08611: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204068.08627: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204068.08654: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204068.08676: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204068.08690: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204068.08706: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204068.08728: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204068.08746: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 12180 1727204068.08766: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204068.08780: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204068.08792: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204068.08810: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204068.08913: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204068.08939: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204068.08963: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204068.09390: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204068.11221: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12180 1727204068.11270: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12180 1727204068.11323: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12180cbnqllfr/tmpk9ne3ljd /root/.ansible/tmp/ansible-tmp-1727204067.7456632-13981-228007359367986/AnsiballZ_service_facts.py <<< 12180 1727204068.11373: stderr chunk (state=3): >>>debug1: Couldn't stat 
remote file: No such file or directory <<< 12180 1727204068.12947: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204068.13088: stderr chunk (state=3): >>><<< 12180 1727204068.13092: stdout chunk (state=3): >>><<< 12180 1727204068.13094: done transferring module to remote 12180 1727204068.13097: _low_level_execute_command(): starting 12180 1727204068.13099: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204067.7456632-13981-228007359367986/ /root/.ansible/tmp/ansible-tmp-1727204067.7456632-13981-228007359367986/AnsiballZ_service_facts.py && sleep 0' 12180 1727204068.13878: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204068.13882: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204068.13909: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 12180 1727204068.13914: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204068.13916: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204068.14002: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204068.14006: stderr chunk (state=3): >>>debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204068.14095: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204068.15982: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204068.16338: stderr chunk (state=3): >>><<< 12180 1727204068.16350: stdout chunk (state=3): >>><<< 12180 1727204068.16448: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204068.16452: _low_level_execute_command(): starting 12180 1727204068.16454: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204067.7456632-13981-228007359367986/AnsiballZ_service_facts.py && sleep 0' 12180 1727204068.19471: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204068.19490: stderr 
chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204068.19504: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204068.19526: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204068.19594: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204068.19624: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204068.19646: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204068.19662: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204068.19675: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204068.19686: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204068.19700: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204068.19714: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204068.19731: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204068.19747: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204068.19757: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204068.19770: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204068.19892: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204068.19923: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204068.19988: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204068.20127: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204069.49426: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": 
"ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "s<<< 12180 1727204069.49444: stdout chunk (state=3): >>>tate": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": 
"not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": 
"systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": 
{"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": 
"rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hi<<< 12180 1727204069.49483: stdout chunk (state=3): >>>bernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 12180 1727204069.50720: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
<<< 12180 1727204069.50724: stderr chunk (state=3): >>><<< 12180 1727204069.50726: stdout chunk (state=3): >>><<< 12180 1727204069.50761: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": 
"systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", 
"status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": 
"stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": 
"systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": 
{"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": 
"rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 12180 1727204069.52951: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204067.7456632-13981-228007359367986/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12180 1727204069.53026: _low_level_execute_command(): starting 12180 1727204069.53033: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204067.7456632-13981-228007359367986/ > /dev/null 2>&1 && sleep 0' 12180 1727204069.53813: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204069.53821: stderr chunk (state=3): >>>debug1: Reading 
configuration data /root/.ssh/config <<< 12180 1727204069.53838: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204069.53859: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204069.53910: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204069.53916: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204069.53930: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204069.53946: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204069.53961: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204069.53970: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204069.53985: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204069.53988: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204069.54031: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204069.54035: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204069.54037: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204069.54039: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204069.54118: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204069.54136: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204069.54140: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204069.54235: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 12180 1727204069.56071: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204069.56075: stdout chunk (state=3): >>><<< 12180 1727204069.56081: stderr chunk (state=3): >>><<< 12180 1727204069.56110: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204069.56114: handler run complete 12180 1727204069.56310: variable 'ansible_facts' from source: unknown 12180 1727204069.56452: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204069.57115: variable 'ansible_facts' from source: unknown 12180 1727204069.57234: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204069.57456: attempt loop complete, returning result 12180 
1727204069.57459: _execute() done 12180 1727204069.57462: dumping result to json 12180 1727204069.57527: done dumping result, returning 12180 1727204069.57536: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which services are running [0affcd87-79f5-ccb1-55ae-00000000018e] 12180 1727204069.57542: sending task result for task 0affcd87-79f5-ccb1-55ae-00000000018e ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12180 1727204069.58598: no more pending results, returning what we have 12180 1727204069.58601: results queue empty 12180 1727204069.58602: checking for any_errors_fatal 12180 1727204069.58604: done checking for any_errors_fatal 12180 1727204069.58605: checking for max_fail_percentage 12180 1727204069.58606: done checking for max_fail_percentage 12180 1727204069.58607: checking to see if all hosts have failed and the running result is not ok 12180 1727204069.58608: done checking to see if all hosts have failed 12180 1727204069.58609: getting the remaining hosts for this loop 12180 1727204069.58610: done getting the remaining hosts for this loop 12180 1727204069.58613: getting the next task for host managed-node1 12180 1727204069.58618: done getting next task for host managed-node1 12180 1727204069.58621: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 12180 1727204069.58625: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12180 1727204069.58635: getting variables 12180 1727204069.58637: in VariableManager get_vars() 12180 1727204069.58669: Calling all_inventory to load vars for managed-node1 12180 1727204069.58672: Calling groups_inventory to load vars for managed-node1 12180 1727204069.58675: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204069.58683: Calling all_plugins_play to load vars for managed-node1 12180 1727204069.58685: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204069.58688: Calling groups_plugins_play to load vars for managed-node1 12180 1727204069.59019: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204069.59510: done with get_vars() 12180 1727204069.59523: done getting variables 12180 1727204069.59559: done sending task result for task 0affcd87-79f5-ccb1-55ae-00000000018e 12180 1727204069.59562: WORKER PROCESS EXITING TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 14:54:29 -0400 (0:00:01.935) 0:00:17.008 ***** 12180 1727204069.59643: entering _queue_task() for managed-node1/package_facts 12180 1727204069.59645: Creating lock for package_facts 12180 1727204069.59951: worker is 1 (out of 1 available) 12180 1727204069.59965: exiting _queue_task() for managed-node1/package_facts 12180 
1727204069.59976: done queuing things up, now waiting for results queue to drain 12180 1727204069.59978: waiting for pending results... 12180 1727204069.60947: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 12180 1727204069.61293: in run() - task 0affcd87-79f5-ccb1-55ae-00000000018f 12180 1727204069.61324: variable 'ansible_search_path' from source: unknown 12180 1727204069.61348: variable 'ansible_search_path' from source: unknown 12180 1727204069.61395: calling self._execute() 12180 1727204069.61501: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204069.61513: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204069.61531: variable 'omit' from source: magic vars 12180 1727204069.61932: variable 'ansible_distribution_major_version' from source: facts 12180 1727204069.61952: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204069.61966: variable 'omit' from source: magic vars 12180 1727204069.62051: variable 'omit' from source: magic vars 12180 1727204069.62098: variable 'omit' from source: magic vars 12180 1727204069.62150: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12180 1727204069.62196: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12180 1727204069.62230: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12180 1727204069.62254: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204069.62274: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204069.62312: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12180 1727204069.62322: 
variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204069.62333: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204069.62446: Set connection var ansible_pipelining to False 12180 1727204069.62455: Set connection var ansible_shell_type to sh 12180 1727204069.62469: Set connection var ansible_timeout to 10 12180 1727204069.62481: Set connection var ansible_connection to ssh 12180 1727204069.62492: Set connection var ansible_shell_executable to /bin/sh 12180 1727204069.62503: Set connection var ansible_module_compression to ZIP_DEFLATED 12180 1727204069.62543: variable 'ansible_shell_executable' from source: unknown 12180 1727204069.62551: variable 'ansible_connection' from source: unknown 12180 1727204069.62559: variable 'ansible_module_compression' from source: unknown 12180 1727204069.62567: variable 'ansible_shell_type' from source: unknown 12180 1727204069.62575: variable 'ansible_shell_executable' from source: unknown 12180 1727204069.62582: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204069.62589: variable 'ansible_pipelining' from source: unknown 12180 1727204069.62596: variable 'ansible_timeout' from source: unknown 12180 1727204069.62603: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204069.62820: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 12180 1727204069.62839: variable 'omit' from source: magic vars 12180 1727204069.62850: starting attempt loop 12180 1727204069.62858: running the handler 12180 1727204069.62882: _low_level_execute_command(): starting 12180 1727204069.62896: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12180 1727204069.64345: stderr chunk (state=2): 
>>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204069.64366: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204069.64382: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204069.64404: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204069.64455: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204069.64472: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204069.64492: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204069.64514: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204069.64531: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204069.64545: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204069.64558: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204069.64577: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204069.64595: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204069.64607: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204069.64618: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204069.64640: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204069.64718: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204069.64751: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204069.64772: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 12180 1727204069.64861: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204069.66409: stdout chunk (state=3): >>>/root <<< 12180 1727204069.66617: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204069.66620: stdout chunk (state=3): >>><<< 12180 1727204069.66622: stderr chunk (state=3): >>><<< 12180 1727204069.66671: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204069.66675: _low_level_execute_command(): starting 12180 1727204069.66678: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204069.6664667-14046-258455254697772 `" && echo ansible-tmp-1727204069.6664667-14046-258455254697772="` 
echo /root/.ansible/tmp/ansible-tmp-1727204069.6664667-14046-258455254697772 `" ) && sleep 0' 12180 1727204069.68434: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204069.68453: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204069.68472: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204069.68492: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204069.68541: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204069.68554: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204069.68571: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204069.68589: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204069.68601: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204069.68614: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204069.68626: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204069.68642: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204069.68656: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204069.68670: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204069.68681: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204069.68694: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204069.68778: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 
1727204069.68800: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204069.68817: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204069.68909: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204069.70745: stdout chunk (state=3): >>>ansible-tmp-1727204069.6664667-14046-258455254697772=/root/.ansible/tmp/ansible-tmp-1727204069.6664667-14046-258455254697772 <<< 12180 1727204069.70881: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204069.70975: stderr chunk (state=3): >>><<< 12180 1727204069.70978: stdout chunk (state=3): >>><<< 12180 1727204069.71078: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204069.6664667-14046-258455254697772=/root/.ansible/tmp/ansible-tmp-1727204069.6664667-14046-258455254697772 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received 
exit status from master 0 12180 1727204069.71083: variable 'ansible_module_compression' from source: unknown 12180 1727204069.71169: ANSIBALLZ: Using lock for package_facts 12180 1727204069.71172: ANSIBALLZ: Acquiring lock 12180 1727204069.71174: ANSIBALLZ: Lock acquired: 140650302755024 12180 1727204069.71178: ANSIBALLZ: Creating module 12180 1727204070.18382: ANSIBALLZ: Writing module into payload 12180 1727204070.18609: ANSIBALLZ: Writing module 12180 1727204070.18660: ANSIBALLZ: Renaming module 12180 1727204070.18681: ANSIBALLZ: Done creating module 12180 1727204070.18737: variable 'ansible_facts' from source: unknown 12180 1727204070.18973: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204069.6664667-14046-258455254697772/AnsiballZ_package_facts.py 12180 1727204070.19179: Sending initial data 12180 1727204070.19183: Sent initial data (162 bytes) 12180 1727204070.20351: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204070.20379: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204070.20408: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204070.20433: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204070.20497: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204070.20520: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204070.20545: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204070.20570: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204070.20586: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204070.20600: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 
1727204070.20615: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204070.20634: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204070.20651: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204070.20666: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204070.20679: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204070.20696: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204070.20774: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204070.20789: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204070.20888: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204070.22623: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12180 1727204070.22672: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12180 1727204070.22723: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12180cbnqllfr/tmpdfm8bzdd 
/root/.ansible/tmp/ansible-tmp-1727204069.6664667-14046-258455254697772/AnsiballZ_package_facts.py <<< 12180 1727204070.22773: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12180 1727204070.25554: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204070.25571: stdout chunk (state=3): >>><<< 12180 1727204070.25575: stderr chunk (state=3): >>><<< 12180 1727204070.25578: done transferring module to remote 12180 1727204070.25580: _low_level_execute_command(): starting 12180 1727204070.25582: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204069.6664667-14046-258455254697772/ /root/.ansible/tmp/ansible-tmp-1727204069.6664667-14046-258455254697772/AnsiballZ_package_facts.py && sleep 0' 12180 1727204070.26504: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204070.26530: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204070.26551: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204070.26586: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204070.26643: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204070.26655: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204070.26671: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204070.26690: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204070.26704: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204070.26715: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204070.26726: stderr chunk (state=3): >>>debug1: Reading configuration 
data /root/.ssh/config <<< 12180 1727204070.26742: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204070.26757: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204070.26771: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204070.26782: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204070.26795: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204070.26874: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204070.26897: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204070.26926: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204070.27042: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204070.28751: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204070.28928: stderr chunk (state=3): >>><<< 12180 1727204070.28947: stdout chunk (state=3): >>><<< 12180 1727204070.29105: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204070.29111: _low_level_execute_command(): starting 12180 1727204070.29114: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204069.6664667-14046-258455254697772/AnsiballZ_package_facts.py && sleep 0' 12180 1727204070.29993: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204070.30011: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204070.30026: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204070.30044: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204070.30105: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204070.30136: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204070.30163: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204070.30199: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204070.30239: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204070.30269: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204070.30289: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204070.30303: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/ssh/ssh_config <<< 12180 1727204070.30321: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204070.30338: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204070.30349: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204070.30362: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204070.30447: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204070.30472: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204070.30487: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204070.30581: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204070.76582: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": 
"centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": 
"8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", 
"release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": <<< 12180 1727204070.76602: stdout chunk (state=3): >>>"53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": 
"kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4"<<< 12180 1727204070.76606: stdout chunk (state=3): >>>, "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": 
"libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", 
"version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x<<< 12180 1727204070.76612: stdout chunk (state=3): >>>86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", 
"release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "47.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": 
"252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "rel<<< 12180 1727204070.76615: stdout chunk (state=3): >>>ease": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "<<< 12180 1727204070.76755: stdout chunk (state=3): >>>libssh": [{"name": 
"libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": 
"rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", 
"version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", 
"version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": 
"18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": 
"cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": 
"2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", 
"release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": 
"perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", 
"version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": 
"mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", 
"version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": 
"python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el<<< 12180 1727204070.76773: stdout chunk (state=3): >>>9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", 
"version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 12180 1727204070.78240: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
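The stdout chunk above completes the JSON returned by the `package_facts` module: under `ansible_facts.packages`, each package name maps to a *list* of installed instances (more than one entry when multiple arches or versions are installed), each with `name`, `version`, `release`, `epoch`, `arch`, and `source` keys. A minimal Python sketch of querying that structure, using two entries copied from the dump (the `installed_version` helper is illustrative, not part of Ansible):

```python
# Shape of the package_facts result, with two entries taken from the log.
# ansible_facts.packages: {name: [instance, ...]}
facts = {
    "ansible_facts": {
        "packages": {
            "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9",
                     "epoch": None, "arch": "x86_64", "source": "rpm"}],
            "dnsmasq": [{"name": "dnsmasq", "version": "2.85",
                         "release": "16.el9", "epoch": None,
                         "arch": "x86_64", "source": "rpm"}],
        }
    }
}

packages = facts["ansible_facts"]["packages"]

def installed_version(name):
    """Return the version of the first installed instance, or None
    when the package is absent (hypothetical helper for illustration)."""
    instances = packages.get(name)
    return instances[0]["version"] if instances else None

print(installed_version("gcc"))       # -> 11.5.0
print(installed_version("missing"))   # -> None
```

In a playbook the same lookup is usually written as `ansible_facts.packages['gcc'][0].version`, guarded by `when: "'gcc' in ansible_facts.packages"`.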
<<< 12180 1727204070.78323: stderr chunk (state=3): >>><<< 12180 1727204070.78327: stdout chunk (state=3): >>><<< 12180 1727204070.78375: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": 
"ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 
1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": 
"4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "47.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": 
"34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": 
"10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": 
"iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": 
"boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": 
[{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", 
"version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", 
"source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": 
[{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", 
"release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": 
"sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": 
"2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
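The module result above exposes `ansible_facts.packages` as a mapping from package name to a *list* of per-architecture entries (each with `name`, `version`, `release`, `epoch`, `arch`, `source`). A minimal sketch of consuming that structure outside Ansible — the `nevra` helper is hypothetical, not part of Ansible or rpm:

```python
import json

# Miniature excerpt of the ansible_facts.packages structure seen in the
# module result above. Each key maps to a LIST because a package can be
# installed for more than one architecture at once.
packages = json.loads("""{
  "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9",
           "epoch": null, "arch": "x86_64", "source": "rpm"}],
  "tar": [{"name": "tar", "version": "1.34", "release": "7.el9",
           "epoch": 2, "arch": "x86_64", "source": "rpm"}]
}""")

def nevra(entry):
    """Format one entry as name-[epoch:]version-release.arch (hypothetical helper)."""
    epoch = f"{entry['epoch']}:" if entry.get("epoch") else ""
    return f"{entry['name']}-{epoch}{entry['version']}-{entry['release']}.{entry['arch']}"

# Flatten the mapping into one NEVRA string per installed package instance.
installed = [nevra(e) for entries in packages.values() for e in entries]
```

The repeated `auto-mux: Trying existing master` / `mux_client_request_session` stderr lines above also show OpenSSH connection multiplexing in use, which is why each command reuses one SSH session instead of reconnecting.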
12180 1727204070.81765: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204069.6664667-14046-258455254697772/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12180 1727204070.81792: _low_level_execute_command(): starting 12180 1727204070.81796: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204069.6664667-14046-258455254697772/ > /dev/null 2>&1 && sleep 0' 12180 1727204070.82669: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204070.82672: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204070.82675: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204070.82677: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204070.82679: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204070.82681: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204070.82683: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204070.82684: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204070.82686: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address 
<<< 12180 1727204070.82688: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204070.82690: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204070.82692: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204070.82694: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204070.82696: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204070.82697: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204070.82699: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204070.82701: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204070.82703: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204070.82705: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204070.82841: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204070.84684: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204070.84761: stderr chunk (state=3): >>><<< 12180 1727204070.84776: stdout chunk (state=3): >>><<< 12180 1727204070.84873: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: 
re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
12180 1727204070.84880: handler run complete
12180 1727204070.85972: variable 'ansible_facts' from source: unknown
12180 1727204070.86637: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12180 1727204070.89445: variable 'ansible_facts' from source: unknown
12180 1727204070.90017: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12180 1727204070.91106: attempt loop complete, returning result
12180 1727204070.91126: _execute() done
12180 1727204070.91138: dumping result to json
12180 1727204070.91555: done dumping result, returning
12180 1727204070.91566: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcd87-79f5-ccb1-55ae-00000000018f]
12180 1727204070.91573: sending task result for task 0affcd87-79f5-ccb1-55ae-00000000018f
ok: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
12180 1727204070.95396: no more pending results, returning what we have
12180 1727204070.95398: results queue empty
12180 1727204070.95401: checking for any_errors_fatal
12180 1727204070.95406: done checking for any_errors_fatal
12180 1727204070.95407: checking for max_fail_percentage
12180 1727204070.95408: done checking for max_fail_percentage
12180 1727204070.95409: checking to see if all hosts have failed and the running result is not ok
12180 1727204070.95410: done checking to see if all hosts have failed
12180 1727204070.95411: getting the remaining hosts for this loop
12180 1727204070.95412: done getting the remaining hosts for this loop
12180 1727204070.95415: getting the next task for host managed-node1
12180 1727204070.95421: done getting next task for host managed-node1
12180 1727204070.95424: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider
12180 1727204070.95427: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12180 1727204070.95439: getting variables
12180 1727204070.95440: in VariableManager get_vars()
12180 1727204070.95474: Calling all_inventory to load vars for managed-node1
12180 1727204070.95477: Calling groups_inventory to load vars for managed-node1
12180 1727204070.95479: Calling all_plugins_inventory to load vars for managed-node1
12180 1727204070.95488: Calling all_plugins_play to load vars for managed-node1
12180 1727204070.95490: Calling groups_plugins_inventory to load vars for managed-node1
12180 1727204070.95492: Calling groups_plugins_play to load vars for managed-node1
12180 1727204070.96518: done sending task result for task 0affcd87-79f5-ccb1-55ae-00000000018f
12180 1727204070.96523: WORKER PROCESS EXITING
12180 1727204070.97157: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12180 1727204070.98579: done with get_vars()
12180 1727204070.98601: done getting variables
12180 1727204070.98648: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Print network provider] **************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7
Tuesday 24 September 2024 14:54:30 -0400 (0:00:01.390) 0:00:18.398 *****
12180 1727204070.98674: entering _queue_task() for managed-node1/debug
12180 1727204070.98888: worker is 1 (out of 1 available)
12180 1727204070.98901: exiting _queue_task() for managed-node1/debug
12180 1727204070.98912: done queuing things up, now waiting for results queue to drain
12180 1727204070.98914: waiting for pending results...
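The censored result above is the standard effect of `no_log: true` on a task's displayed output. A hypothetical sketch of a task that would print this way (the module and task body here are assumptions for illustration; only the task name and the `no_log: true` behavior come from the log):

```yaml
# Hypothetical sketch: any task's result is replaced by the "censored" message
# in output when no_log is set. The real "Check which packages are installed"
# task in fedora.linux_system_roles may use a different module entirely.
- name: Check which packages are installed
  package_facts:
    manager: auto
  no_log: true
```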
12180 1727204070.99099: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Print network provider
12180 1727204070.99185: in run() - task 0affcd87-79f5-ccb1-55ae-000000000027
12180 1727204070.99195: variable 'ansible_search_path' from source: unknown
12180 1727204070.99199: variable 'ansible_search_path' from source: unknown
12180 1727204070.99228: calling self._execute()
12180 1727204070.99294: variable 'ansible_host' from source: host vars for 'managed-node1'
12180 1727204070.99298: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12180 1727204070.99307: variable 'omit' from source: magic vars
12180 1727204070.99603: variable 'ansible_distribution_major_version' from source: facts
12180 1727204070.99619: Evaluated conditional (ansible_distribution_major_version != '6'): True
12180 1727204070.99625: variable 'omit' from source: magic vars
12180 1727204070.99920: variable 'omit' from source: magic vars
12180 1727204070.99923: variable 'network_provider' from source: set_fact
12180 1727204070.99926: variable 'omit' from source: magic vars
12180 1727204070.99931: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
12180 1727204070.99934: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
12180 1727204070.99936: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
12180 1727204070.99948: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
12180 1727204070.99959: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
12180 1727204070.99991: variable 'inventory_hostname' from source: host vars for 'managed-node1'
12180 1727204070.99994: variable 'ansible_host' from source: host vars for 'managed-node1'
12180 1727204070.99997: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12180 1727204071.00110: Set connection var ansible_pipelining to False
12180 1727204071.00117: Set connection var ansible_shell_type to sh
12180 1727204071.00124: Set connection var ansible_timeout to 10
12180 1727204071.00133: Set connection var ansible_connection to ssh
12180 1727204071.00136: Set connection var ansible_shell_executable to /bin/sh
12180 1727204071.00141: Set connection var ansible_module_compression to ZIP_DEFLATED
12180 1727204071.00176: variable 'ansible_shell_executable' from source: unknown
12180 1727204071.00179: variable 'ansible_connection' from source: unknown
12180 1727204071.00182: variable 'ansible_module_compression' from source: unknown
12180 1727204071.00185: variable 'ansible_shell_type' from source: unknown
12180 1727204071.00187: variable 'ansible_shell_executable' from source: unknown
12180 1727204071.00189: variable 'ansible_host' from source: host vars for 'managed-node1'
12180 1727204071.00192: variable 'ansible_pipelining' from source: unknown
12180 1727204071.00194: variable 'ansible_timeout' from source: unknown
12180 1727204071.00198: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12180 1727204071.00346: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
12180 1727204071.00359: variable 'omit' from source: magic vars
12180 1727204071.00371: starting attempt loop
12180 1727204071.00374: running the handler
12180 1727204071.00417: handler run complete
12180 1727204071.00433: attempt loop complete, returning result
12180 1727204071.00436: _execute() done
12180 1727204071.00439: dumping result to json
12180 1727204071.00443: done dumping result, returning
12180 1727204071.00451: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Print network provider [0affcd87-79f5-ccb1-55ae-000000000027]
12180 1727204071.00457: sending task result for task 0affcd87-79f5-ccb1-55ae-000000000027
12180 1727204071.00550: done sending task result for task 0affcd87-79f5-ccb1-55ae-000000000027
12180 1727204071.00553: WORKER PROCESS EXITING
ok: [managed-node1] => {}

MSG:

Using network provider: nm
12180 1727204071.00639: no more pending results, returning what we have
12180 1727204071.00642: results queue empty
12180 1727204071.00643: checking for any_errors_fatal
12180 1727204071.00652: done checking for any_errors_fatal
12180 1727204071.00653: checking for max_fail_percentage
12180 1727204071.00655: done checking for max_fail_percentage
12180 1727204071.00656: checking to see if all hosts have failed and the running result is not ok
12180 1727204071.00657: done checking to see if all hosts have failed
12180 1727204071.00657: getting the remaining hosts for this loop
12180 1727204071.00659: done getting the remaining hosts for this loop
12180 1727204071.00663: getting the next task for host managed-node1
12180 1727204071.00671: done getting next task for host managed-node1
12180 1727204071.00674: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
12180 1727204071.00677: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12180 1727204071.00687: getting variables
12180 1727204071.00689: in VariableManager get_vars()
12180 1727204071.00726: Calling all_inventory to load vars for managed-node1
12180 1727204071.00730: Calling groups_inventory to load vars for managed-node1
12180 1727204071.00733: Calling all_plugins_inventory to load vars for managed-node1
12180 1727204071.00741: Calling all_plugins_play to load vars for managed-node1
12180 1727204071.00743: Calling groups_plugins_inventory to load vars for managed-node1
12180 1727204071.00745: Calling groups_plugins_play to load vars for managed-node1
12180 1727204071.01795: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12180 1727204071.02741: done with get_vars()
12180 1727204071.02762: done getting variables
12180 1727204071.02833: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)

TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] ***
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11
Tuesday 24 September 2024 14:54:31 -0400 (0:00:00.041) 0:00:18.440 *****
12180 1727204071.02860: entering _queue_task() for managed-node1/fail
12180 1727204071.02861: Creating lock for fail
12180 1727204071.03092: worker is 1 (out of 1 available)
12180 1727204071.03108: exiting _queue_task() for managed-node1/fail
12180 1727204071.03151: done queuing things up, now waiting for results queue to drain
12180 1727204071.03153: waiting for pending results...
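The `ok` result above, with `MSG: Using network provider: nm`, comes from the role's debug task at tasks/main.yml:7. A minimal sketch of such a task (the exact task body is an assumption; `network_provider` is the set_fact variable named in the log's variable trace):

```yaml
# Sketch of a debug task that would emit "Using network provider: nm";
# network_provider is set earlier in the role via set_fact, per the log.
- name: Print network provider
  debug:
    msg: "Using network provider: {{ network_provider }}"
```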
12180 1727204071.03371: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
12180 1727204071.03676: in run() - task 0affcd87-79f5-ccb1-55ae-000000000028
12180 1727204071.03680: variable 'ansible_search_path' from source: unknown
12180 1727204071.03683: variable 'ansible_search_path' from source: unknown
12180 1727204071.03686: calling self._execute()
12180 1727204071.03689: variable 'ansible_host' from source: host vars for 'managed-node1'
12180 1727204071.03691: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12180 1727204071.03694: variable 'omit' from source: magic vars
12180 1727204071.04059: variable 'ansible_distribution_major_version' from source: facts
12180 1727204071.04072: Evaluated conditional (ansible_distribution_major_version != '6'): True
12180 1727204071.04195: variable 'network_state' from source: role '' defaults
12180 1727204071.04204: Evaluated conditional (network_state != {}): False
12180 1727204071.04207: when evaluation is False, skipping this task
12180 1727204071.04210: _execute() done
12180 1727204071.04215: dumping result to json
12180 1727204071.04217: done dumping result, returning
12180 1727204071.04220: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcd87-79f5-ccb1-55ae-000000000028]
12180 1727204071.04233: sending task result for task 0affcd87-79f5-ccb1-55ae-000000000028
12180 1727204071.04325: done sending task result for task 0affcd87-79f5-ccb1-55ae-000000000028
12180 1727204071.04327: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
12180 1727204071.04383: no more pending results, returning what we have
12180 1727204071.04387: results queue empty
12180 1727204071.04388: checking for any_errors_fatal
12180 1727204071.04395: done checking for any_errors_fatal
12180 1727204071.04395: checking for max_fail_percentage
12180 1727204071.04397: done checking for max_fail_percentage
12180 1727204071.04398: checking to see if all hosts have failed and the running result is not ok
12180 1727204071.04399: done checking to see if all hosts have failed
12180 1727204071.04399: getting the remaining hosts for this loop
12180 1727204071.04401: done getting the remaining hosts for this loop
12180 1727204071.04404: getting the next task for host managed-node1
12180 1727204071.04412: done getting next task for host managed-node1
12180 1727204071.04417: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8
12180 1727204071.04420: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12180 1727204071.04437: getting variables
12180 1727204071.04439: in VariableManager get_vars()
12180 1727204071.04484: Calling all_inventory to load vars for managed-node1
12180 1727204071.04487: Calling groups_inventory to load vars for managed-node1
12180 1727204071.04489: Calling all_plugins_inventory to load vars for managed-node1
12180 1727204071.04498: Calling all_plugins_play to load vars for managed-node1
12180 1727204071.04500: Calling groups_plugins_inventory to load vars for managed-node1
12180 1727204071.04502: Calling groups_plugins_play to load vars for managed-node1
12180 1727204071.05900: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12180 1727204071.06834: done with get_vars()
12180 1727204071.06850: done getting variables
12180 1727204071.06897: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] ***
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18
Tuesday 24 September 2024 14:54:31 -0400 (0:00:00.040) 0:00:18.481 *****
12180 1727204071.06921: entering _queue_task() for managed-node1/fail
12180 1727204071.07144: worker is 1 (out of 1 available)
12180 1727204071.07157: exiting _queue_task() for managed-node1/fail
12180 1727204071.07170: done queuing things up, now waiting for results queue to drain
12180 1727204071.07172: waiting for pending results...
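The skips above follow one pattern: a `fail` task guarded by a `when` clause that includes `network_state != {}`. Because `network_state` still holds its empty role default, the condition evaluates False and the task is skipped rather than run. A hedged sketch of such a guard (the `network_state != {}` condition is quoted from the log; the message text and the second condition are assumptions):

```yaml
# Sketch of the guard pattern behind the "skipping" results above; Ansible
# evaluates each when clause and skips the fail task if any clause is False.
- name: Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
  fail:
    msg: Applying network_state is not supported with the initscripts provider.
  when:
    - network_state != {}
    - network_provider == "initscripts"
```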
12180 1727204071.07396: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8
12180 1727204071.07540: in run() - task 0affcd87-79f5-ccb1-55ae-000000000029
12180 1727204071.07568: variable 'ansible_search_path' from source: unknown
12180 1727204071.07577: variable 'ansible_search_path' from source: unknown
12180 1727204071.07615: calling self._execute()
12180 1727204071.07718: variable 'ansible_host' from source: host vars for 'managed-node1'
12180 1727204071.07732: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12180 1727204071.07754: variable 'omit' from source: magic vars
12180 1727204071.08163: variable 'ansible_distribution_major_version' from source: facts
12180 1727204071.08196: Evaluated conditional (ansible_distribution_major_version != '6'): True
12180 1727204071.08342: variable 'network_state' from source: role '' defaults
12180 1727204071.08355: Evaluated conditional (network_state != {}): False
12180 1727204071.08362: when evaluation is False, skipping this task
12180 1727204071.08371: _execute() done
12180 1727204071.08377: dumping result to json
12180 1727204071.08385: done dumping result, returning
12180 1727204071.08404: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affcd87-79f5-ccb1-55ae-000000000029]
12180 1727204071.08423: sending task result for task 0affcd87-79f5-ccb1-55ae-000000000029
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
12180 1727204071.08585: no more pending results, returning what we have
12180 1727204071.08589: results queue empty
12180 1727204071.08590: checking for any_errors_fatal
12180 1727204071.08597: done checking for any_errors_fatal
12180 1727204071.08598: checking for max_fail_percentage
12180 1727204071.08600: done checking for max_fail_percentage
12180 1727204071.08601: checking to see if all hosts have failed and the running result is not ok
12180 1727204071.08601: done checking to see if all hosts have failed
12180 1727204071.08602: getting the remaining hosts for this loop
12180 1727204071.08604: done getting the remaining hosts for this loop
12180 1727204071.08607: getting the next task for host managed-node1
12180 1727204071.08616: done getting next task for host managed-node1
12180 1727204071.08620: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later
12180 1727204071.08622: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12180 1727204071.08642: getting variables
12180 1727204071.08643: in VariableManager get_vars()
12180 1727204071.08691: Calling all_inventory to load vars for managed-node1
12180 1727204071.08694: Calling groups_inventory to load vars for managed-node1
12180 1727204071.08696: Calling all_plugins_inventory to load vars for managed-node1
12180 1727204071.08707: Calling all_plugins_play to load vars for managed-node1
12180 1727204071.08709: Calling groups_plugins_inventory to load vars for managed-node1
12180 1727204071.08711: Calling groups_plugins_play to load vars for managed-node1
12180 1727204071.09379: done sending task result for task 0affcd87-79f5-ccb1-55ae-000000000029
12180 1727204071.09383: WORKER PROCESS EXITING
12180 1727204071.10218: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12180 1727204071.11339: done with get_vars()
12180 1727204071.11356: done getting variables
12180 1727204071.11402: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] ***
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25
Tuesday 24 September 2024 14:54:31 -0400 (0:00:00.045) 0:00:18.526 *****
12180 1727204071.11430: entering _queue_task() for managed-node1/fail
12180 1727204071.11671: worker is 1 (out of 1 available)
12180 1727204071.11685: exiting _queue_task() for managed-node1/fail
12180 1727204071.11697: done queuing things up, now waiting for results queue to drain
12180 1727204071.11699: waiting for pending results...
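The EL10 teaming guard queued above works the same way but gates on the distribution major version: the log records `Evaluated conditional (ansible_distribution_major_version | int > 9): False` on this node, so the abort is skipped. A sketch of such a task (the conditional is quoted from the log; the message text is an assumption):

```yaml
# Sketch of a version-gated abort; the | int filter casts the fact, which is
# a string, before the numeric comparison.
- name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
  fail:
    msg: Team interfaces are not supported on EL10 or later.
  when: ansible_distribution_major_version | int > 9
```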
12180 1727204071.11886: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later
12180 1727204071.12033: in run() - task 0affcd87-79f5-ccb1-55ae-00000000002a
12180 1727204071.12061: variable 'ansible_search_path' from source: unknown
12180 1727204071.12079: variable 'ansible_search_path' from source: unknown
12180 1727204071.12130: calling self._execute()
12180 1727204071.12241: variable 'ansible_host' from source: host vars for 'managed-node1'
12180 1727204071.12253: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12180 1727204071.12271: variable 'omit' from source: magic vars
12180 1727204071.12711: variable 'ansible_distribution_major_version' from source: facts
12180 1727204071.12743: Evaluated conditional (ansible_distribution_major_version != '6'): True
12180 1727204071.12957: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
12180 1727204071.15684: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
12180 1727204071.15756: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
12180 1727204071.15790: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
12180 1727204071.15833: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
12180 1727204071.15863: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
12180 1727204071.15990: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12180 1727204071.16032: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12180 1727204071.16076: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12180 1727204071.16125: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12180 1727204071.16245: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12180 1727204071.16381: variable 'ansible_distribution_major_version' from source: facts
12180 1727204071.16403: Evaluated conditional (ansible_distribution_major_version | int > 9): False
12180 1727204071.16480: when evaluation is False, skipping this task
12180 1727204071.16496: _execute() done
12180 1727204071.16504: dumping result to json
12180 1727204071.16512: done dumping result, returning
12180 1727204071.16531: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcd87-79f5-ccb1-55ae-00000000002a]
12180 1727204071.16544: sending task result for task 0affcd87-79f5-ccb1-55ae-00000000002a
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version | int > 9",
    "skip_reason": "Conditional result was False"
}
12180 1727204071.16749: no more pending results, returning what we have
12180 1727204071.16753: results queue empty
12180 1727204071.16754: checking for any_errors_fatal
12180 1727204071.16761: done checking for any_errors_fatal
12180 1727204071.16762: checking for max_fail_percentage
12180 1727204071.16770: done checking for max_fail_percentage
12180 1727204071.16771: checking to see if all hosts have failed and the running result is not ok
12180 1727204071.16772: done checking to see if all hosts have failed
12180 1727204071.16773: getting the remaining hosts for this loop
12180 1727204071.16775: done getting the remaining hosts for this loop
12180 1727204071.16779: getting the next task for host managed-node1
12180 1727204071.16787: done getting next task for host managed-node1
12180 1727204071.16792: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
12180 1727204071.16795: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12180 1727204071.16813: getting variables
12180 1727204071.16815: in VariableManager get_vars()
12180 1727204071.16862: Calling all_inventory to load vars for managed-node1
12180 1727204071.16867: Calling groups_inventory to load vars for managed-node1
12180 1727204071.16870: Calling all_plugins_inventory to load vars for managed-node1
12180 1727204071.16883: Calling all_plugins_play to load vars for managed-node1
12180 1727204071.16886: Calling groups_plugins_inventory to load vars for managed-node1
12180 1727204071.16889: Calling groups_plugins_play to load vars for managed-node1
12180 1727204071.17517: done sending task result for task 0affcd87-79f5-ccb1-55ae-00000000002a
12180 1727204071.17521: WORKER PROCESS EXITING
12180 1727204071.19012: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12180 1727204071.20958: done with get_vars()
12180 1727204071.20994: done getting variables
12180 1727204071.21116: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)

TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] ***
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36
Tuesday 24 September 2024 14:54:31 -0400 (0:00:00.097) 0:00:18.623 *****
12180 1727204071.21166: entering _queue_task() for managed-node1/dnf
12180 1727204071.21511: worker is 1 (out of 1 available)
12180 1727204071.21525: exiting _queue_task() for managed-node1/dnf
12180 1727204071.21541: done queuing things up, now waiting for results queue to drain
12180 1727204071.21543: waiting for pending results...
12180 1727204071.21865: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
12180 1727204071.22032: in run() - task 0affcd87-79f5-ccb1-55ae-00000000002b
12180 1727204071.22052: variable 'ansible_search_path' from source: unknown
12180 1727204071.22061: variable 'ansible_search_path' from source: unknown
12180 1727204071.22118: calling self._execute()
12180 1727204071.22222: variable 'ansible_host' from source: host vars for 'managed-node1'
12180 1727204071.22245: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12180 1727204071.22259: variable 'omit' from source: magic vars
12180 1727204071.22698: variable 'ansible_distribution_major_version' from source: facts
12180 1727204071.22717: Evaluated conditional (ansible_distribution_major_version != '6'): True
12180 1727204071.22951: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
12180 1727204071.25673: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
12180 1727204071.25754: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
12180 1727204071.25790: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
12180 1727204071.25859: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
12180 1727204071.25869: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
12180 1727204071.25962: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12180 1727204071.25993: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12180 1727204071.26080: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12180 1727204071.26084: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12180 1727204071.26090: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12180 1727204071.26225: variable 'ansible_distribution' from source: facts
12180 1727204071.26236: variable 'ansible_distribution_major_version' from source: facts
12180 1727204071.26253: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True
12180 1727204071.26419: variable '__network_wireless_connections_defined' from source: role '' defaults
12180 1727204071.26572: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12180 1727204071.26576: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12180 1727204071.26600: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12180 1727204071.26644: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12180 1727204071.26659: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12180 1727204071.26712: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12180 1727204071.26737: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12180 1727204071.26761: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12180 1727204071.26815: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12180 1727204071.26829: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12180 1727204071.26872: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12180 1727204071.26900: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12180 1727204071.26932: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12180 1727204071.27018: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12180 1727204071.27022: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12180 1727204071.27174: variable 'network_connections' from source: task vars
12180 1727204071.27186: variable 'controller_profile' from source: play vars
12180 1727204071.27271: variable 'controller_profile' from source: play vars
12180 1727204071.27304: variable 'controller_device' from source: play vars
12180 1727204071.27351: variable 'controller_device' from source: play vars
12180 1727204071.27362: variable 'port1_profile' from source: play vars
12180 1727204071.27424: variable 'port1_profile' from source: play vars
12180 1727204071.27438: variable 'dhcp_interface1' from source: play vars
12180 1727204071.27574: variable 'dhcp_interface1' from source: play vars
12180 1727204071.27578: variable 'controller_profile' from source: play vars
12180 1727204071.27600: variable 'controller_profile' from source: play vars
12180 1727204071.27604: variable 'port2_profile' from source: play vars
12180 1727204071.27657: variable 'port2_profile' from source: play vars
12180 1727204071.27666: variable 'dhcp_interface2' from source: play vars
12180 1727204071.27731: variable 'dhcp_interface2' from source: play vars
12180 1727204071.27742: variable 'controller_profile' from source: play vars
12180 1727204071.27814: variable 'controller_profile' from source: play vars
12180 1727204071.27918: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
12180 1727204071.28120: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
12180 1727204071.28160: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
12180 1727204071.28192: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
12180 1727204071.28238: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
12180 1727204071.28286: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
12180 1727204071.28312: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
12180 1727204071.28349: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
12180 1727204071.28376: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
12180 1727204071.28447: variable '__network_team_connections_defined' from source: role '' defaults
12180 1727204071.28609: variable 'network_connections' from source: task vars
12180 1727204071.28613: variable 'controller_profile' from source: play vars
12180 1727204071.28673: variable 'controller_profile' from source: play vars
12180 1727204071.28679: variable 'controller_device' from source: play vars
12180 1727204071.28721: variable 'controller_device' from source: play vars
12180 1727204071.28728: variable
'port1_profile' from source: play vars 12180 1727204071.28777: variable 'port1_profile' from source: play vars 12180 1727204071.28783: variable 'dhcp_interface1' from source: play vars 12180 1727204071.28825: variable 'dhcp_interface1' from source: play vars 12180 1727204071.28832: variable 'controller_profile' from source: play vars 12180 1727204071.28878: variable 'controller_profile' from source: play vars 12180 1727204071.28885: variable 'port2_profile' from source: play vars 12180 1727204071.28925: variable 'port2_profile' from source: play vars 12180 1727204071.28934: variable 'dhcp_interface2' from source: play vars 12180 1727204071.28983: variable 'dhcp_interface2' from source: play vars 12180 1727204071.28986: variable 'controller_profile' from source: play vars 12180 1727204071.29028: variable 'controller_profile' from source: play vars 12180 1727204071.29062: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 12180 1727204071.29067: when evaluation is False, skipping this task 12180 1727204071.29069: _execute() done 12180 1727204071.29072: dumping result to json 12180 1727204071.29079: done dumping result, returning 12180 1727204071.29082: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcd87-79f5-ccb1-55ae-00000000002b] 12180 1727204071.29088: sending task result for task 0affcd87-79f5-ccb1-55ae-00000000002b 12180 1727204071.29186: done sending task result for task 0affcd87-79f5-ccb1-55ae-00000000002b 12180 1727204071.29189: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 12180 1727204071.29238: no more pending results, returning what we have 12180 1727204071.29241: 
results queue empty 12180 1727204071.29242: checking for any_errors_fatal 12180 1727204071.29248: done checking for any_errors_fatal 12180 1727204071.29249: checking for max_fail_percentage 12180 1727204071.29250: done checking for max_fail_percentage 12180 1727204071.29251: checking to see if all hosts have failed and the running result is not ok 12180 1727204071.29252: done checking to see if all hosts have failed 12180 1727204071.29253: getting the remaining hosts for this loop 12180 1727204071.29254: done getting the remaining hosts for this loop 12180 1727204071.29258: getting the next task for host managed-node1 12180 1727204071.29266: done getting next task for host managed-node1 12180 1727204071.29269: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 12180 1727204071.29273: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12180 1727204071.29286: getting variables 12180 1727204071.29287: in VariableManager get_vars() 12180 1727204071.29328: Calling all_inventory to load vars for managed-node1 12180 1727204071.29331: Calling groups_inventory to load vars for managed-node1 12180 1727204071.29333: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204071.29344: Calling all_plugins_play to load vars for managed-node1 12180 1727204071.29346: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204071.29349: Calling groups_plugins_play to load vars for managed-node1 12180 1727204071.30332: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204071.31419: done with get_vars() 12180 1727204071.31441: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 12180 1727204071.31526: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 14:54:31 -0400 (0:00:00.103) 0:00:18.727 ***** 12180 1727204071.31559: entering _queue_task() for managed-node1/yum 12180 1727204071.31560: Creating lock for yum 12180 1727204071.31861: worker is 1 (out of 1 available) 12180 1727204071.31873: exiting _queue_task() for managed-node1/yum 12180 1727204071.31884: done queuing things up, now waiting for results queue to drain 12180 1727204071.31885: waiting for pending results... 
12180 1727204071.32146: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
12180 1727204071.32255: in run() - task 0affcd87-79f5-ccb1-55ae-00000000002c
12180 1727204071.32271: variable 'ansible_search_path' from source: unknown
12180 1727204071.32275: variable 'ansible_search_path' from source: unknown
12180 1727204071.32305: calling self._execute()
12180 1727204071.32388: variable 'ansible_host' from source: host vars for 'managed-node1'
12180 1727204071.32392: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12180 1727204071.32400: variable 'omit' from source: magic vars
12180 1727204071.32678: variable 'ansible_distribution_major_version' from source: facts
12180 1727204071.32687: Evaluated conditional (ansible_distribution_major_version != '6'): True
12180 1727204071.32813: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
12180 1727204071.34452: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
12180 1727204071.34500: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
12180 1727204071.34530: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
12180 1727204071.34560: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
12180 1727204071.34583: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
12180 1727204071.34642: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12180 1727204071.34666: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12180 1727204071.34686: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12180 1727204071.34714: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12180 1727204071.34727: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12180 1727204071.34801: variable 'ansible_distribution_major_version' from source: facts
12180 1727204071.34814: Evaluated conditional (ansible_distribution_major_version | int < 8): False
12180 1727204071.34819: when evaluation is False, skipping this task
12180 1727204071.34822: _execute() done
12180 1727204071.34825: dumping result to json
12180 1727204071.34827: done dumping result, returning
12180 1727204071.34835: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcd87-79f5-ccb1-55ae-00000000002c]
12180 1727204071.34841: sending task result for task 0affcd87-79f5-ccb1-55ae-00000000002c
12180 1727204071.34933: done sending task result for task 0affcd87-79f5-ccb1-55ae-00000000002c
12180 1727204071.34937: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version | int < 8",
    "skip_reason": "Conditional result was False"
}
12180 1727204071.34992: no more pending results, returning what we have
12180 1727204071.34995: results queue empty
12180 1727204071.34996: checking for any_errors_fatal
12180 1727204071.35002: done checking for any_errors_fatal
12180 1727204071.35002: checking for max_fail_percentage
12180 1727204071.35004: done checking for max_fail_percentage
12180 1727204071.35005: checking to see if all hosts have failed and the running result is not ok
12180 1727204071.35006: done checking to see if all hosts have failed
12180 1727204071.35007: getting the remaining hosts for this loop
12180 1727204071.35008: done getting the remaining hosts for this loop
12180 1727204071.35012: getting the next task for host managed-node1
12180 1727204071.35019: done getting next task for host managed-node1
12180 1727204071.35022: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces
12180 1727204071.35025: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12180 1727204071.35040: getting variables
12180 1727204071.35041: in VariableManager get_vars()
12180 1727204071.35083: Calling all_inventory to load vars for managed-node1
12180 1727204071.35086: Calling groups_inventory to load vars for managed-node1
12180 1727204071.35088: Calling all_plugins_inventory to load vars for managed-node1
12180 1727204071.35096: Calling all_plugins_play to load vars for managed-node1
12180 1727204071.35098: Calling groups_plugins_inventory to load vars for managed-node1
12180 1727204071.35100: Calling groups_plugins_play to load vars for managed-node1
12180 1727204071.35898: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12180 1727204071.36832: done with get_vars()
12180 1727204071.36850: done getting variables
12180 1727204071.36897: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60
Tuesday 24 September 2024 14:54:31 -0400 (0:00:00.053) 0:00:18.781 *****
12180 1727204071.36923: entering _queue_task() for managed-node1/fail
12180 1727204071.37158: worker is 1 (out of 1 available)
12180 1727204071.37173: exiting _queue_task() for managed-node1/fail
12180 1727204071.37186: done queuing things up, now waiting for results queue to drain
12180 1727204071.37188: waiting for pending results...
12180 1727204071.37366: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces
12180 1727204071.37471: in run() - task 0affcd87-79f5-ccb1-55ae-00000000002d
12180 1727204071.37479: variable 'ansible_search_path' from source: unknown
12180 1727204071.37483: variable 'ansible_search_path' from source: unknown
12180 1727204071.37512: calling self._execute()
12180 1727204071.37580: variable 'ansible_host' from source: host vars for 'managed-node1'
12180 1727204071.37583: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12180 1727204071.37591: variable 'omit' from source: magic vars
12180 1727204071.37870: variable 'ansible_distribution_major_version' from source: facts
12180 1727204071.37882: Evaluated conditional (ansible_distribution_major_version != '6'): True
12180 1727204071.37962: variable '__network_wireless_connections_defined' from source: role '' defaults
12180 1727204071.38103: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
12180 1727204071.40512: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
12180 1727204071.40575: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
12180 1727204071.40607: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
12180 1727204071.40641: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
12180 1727204071.40666: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
12180 1727204071.40739: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12180 1727204071.40766: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12180 1727204071.40789: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12180 1727204071.40834: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12180 1727204071.40843: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12180 1727204071.40877: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12180 1727204071.40893: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12180 1727204071.40909: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12180 1727204071.40943: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12180 1727204071.40951: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12180 1727204071.40981: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12180 1727204071.40997: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12180 1727204071.41013: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12180 1727204071.41044: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12180 1727204071.41053: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12180 1727204071.41169: variable 'network_connections' from source: task vars
12180 1727204071.41179: variable 'controller_profile' from source: play vars
12180 1727204071.41231: variable 'controller_profile' from source: play vars
12180 1727204071.41241: variable 'controller_device' from source: play vars
12180 1727204071.41292: variable 'controller_device' from source: play vars
12180 1727204071.41301: variable 'port1_profile' from source: play vars
12180 1727204071.41346: variable 'port1_profile' from source: play vars
12180 1727204071.41353: variable 'dhcp_interface1' from source: play vars
12180 1727204071.41400: variable 'dhcp_interface1' from source: play vars
12180 1727204071.41406: variable 'controller_profile' from source: play vars
12180 1727204071.41450: variable 'controller_profile' from source: play vars
12180 1727204071.41456: variable 'port2_profile' from source: play vars
12180 1727204071.41502: variable 'port2_profile' from source: play vars
12180 1727204071.41508: variable 'dhcp_interface2' from source: play vars
12180 1727204071.41552: variable 'dhcp_interface2' from source: play vars
12180 1727204071.41558: variable 'controller_profile' from source: play vars
12180 1727204071.41605: variable 'controller_profile' from source: play vars
12180 1727204071.41656: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
12180 1727204071.41789: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
12180 1727204071.41819: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
12180 1727204071.41846: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
12180 1727204071.41870: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
12180 1727204071.41904: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
12180 1727204071.41924: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
12180 1727204071.41944: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
12180 1727204071.41962: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
12180 1727204071.42015: variable '__network_team_connections_defined' from source: role '' defaults
12180 1727204071.42173: variable 'network_connections' from source: task vars
12180 1727204071.42177: variable 'controller_profile' from source: play vars
12180 1727204071.42220: variable 'controller_profile' from source: play vars
12180 1727204071.42225: variable 'controller_device' from source: play vars
12180 1727204071.42274: variable 'controller_device' from source: play vars
12180 1727204071.42282: variable 'port1_profile' from source: play vars
12180 1727204071.42323: variable 'port1_profile' from source: play vars
12180 1727204071.42329: variable 'dhcp_interface1' from source: play vars
12180 1727204071.42378: variable 'dhcp_interface1' from source: play vars
12180 1727204071.42384: variable 'controller_profile' from source: play vars
12180 1727204071.42426: variable 'controller_profile' from source: play vars
12180 1727204071.42434: variable 'port2_profile' from source: play vars
12180 1727204071.42480: variable 'port2_profile' from source: play vars
12180 1727204071.42486: variable 'dhcp_interface2' from source: play vars
12180 1727204071.42528: variable 'dhcp_interface2' from source: play vars
12180 1727204071.42536: variable 'controller_profile' from source: play vars
12180 1727204071.42585: variable 'controller_profile' from source: play vars
12180 1727204071.42608: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
12180 1727204071.42611: when evaluation is False, skipping this task
12180 1727204071.42614: _execute() done
12180 1727204071.42616: dumping result to json
12180 1727204071.42618: done dumping result, returning
12180 1727204071.42625: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-ccb1-55ae-00000000002d]
12180 1727204071.42633: sending task result for task 0affcd87-79f5-ccb1-55ae-00000000002d
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
12180 1727204071.42778: no more pending results, returning what we have
12180 1727204071.42782: results queue empty
12180 1727204071.42782: checking for any_errors_fatal
12180 1727204071.42788: done checking for any_errors_fatal
12180 1727204071.42789: checking for max_fail_percentage
12180 1727204071.42791: done checking for max_fail_percentage
12180 1727204071.42792: checking to see if all hosts have failed and the running result is not ok
12180 1727204071.42793: done checking to see if all hosts have failed
12180 1727204071.42793: getting the remaining hosts for this loop
12180 1727204071.42795: done getting the remaining hosts for this loop
12180 1727204071.42799: getting the next task for host managed-node1
12180 1727204071.42806: done getting next task for host managed-node1
12180 1727204071.42810: ^ task is: TASK: fedora.linux_system_roles.network : Install packages
12180 1727204071.42813: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12180 1727204071.42827: getting variables
12180 1727204071.42828: in VariableManager get_vars()
12180 1727204071.42873: Calling all_inventory to load vars for managed-node1
12180 1727204071.42876: Calling groups_inventory to load vars for managed-node1
12180 1727204071.42878: Calling all_plugins_inventory to load vars for managed-node1
12180 1727204071.42888: Calling all_plugins_play to load vars for managed-node1
12180 1727204071.42890: Calling groups_plugins_inventory to load vars for managed-node1
12180 1727204071.42892: Calling groups_plugins_play to load vars for managed-node1
12180 1727204071.43852: done sending task result for task 0affcd87-79f5-ccb1-55ae-00000000002d
12180 1727204071.43859: WORKER PROCESS EXITING
12180 1727204071.43870: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12180 1727204071.44793: done with get_vars()
12180 1727204071.44814: done getting variables
12180 1727204071.44862: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install packages] ********************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73
Tuesday 24 September 2024 14:54:31 -0400 (0:00:00.079) 0:00:18.860 *****
12180 1727204071.44888: entering _queue_task() for managed-node1/package
12180 1727204071.45123: worker is 1 (out of 1 available)
12180 1727204071.45139: exiting _queue_task() for managed-node1/package
12180 1727204071.45151: done queuing things up, now waiting for results queue to drain
12180 1727204071.45153: waiting for pending results...
12180 1727204071.45337: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install packages
12180 1727204071.45416: in run() - task 0affcd87-79f5-ccb1-55ae-00000000002e
12180 1727204071.45426: variable 'ansible_search_path' from source: unknown
12180 1727204071.45432: variable 'ansible_search_path' from source: unknown
12180 1727204071.45461: calling self._execute()
12180 1727204071.45531: variable 'ansible_host' from source: host vars for 'managed-node1'
12180 1727204071.45535: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12180 1727204071.45541: variable 'omit' from source: magic vars
12180 1727204071.45818: variable 'ansible_distribution_major_version' from source: facts
12180 1727204071.45831: Evaluated conditional (ansible_distribution_major_version != '6'): True
12180 1727204071.45969: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
12180 1727204071.46170: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
12180 1727204071.46203: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
12180 1727204071.46233: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
12180 1727204071.46258: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
12180 1727204071.46336: variable 'network_packages' from source: role '' defaults
12180 1727204071.46413: variable '__network_provider_setup' from source: role '' defaults
12180 1727204071.46422: variable '__network_service_name_default_nm' from source: role '' defaults
12180 1727204071.46473: variable '__network_service_name_default_nm' from source: role '' defaults
12180 1727204071.46480: variable '__network_packages_default_nm' from source: role '' defaults
12180 1727204071.46526: variable '__network_packages_default_nm' from source: role '' defaults
12180 1727204071.46673: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
12180 1727204071.48126: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
12180 1727204071.48181: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
12180 1727204071.48213: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
12180 1727204071.48237: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
12180 1727204071.48258: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
12180 1727204071.48322: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12180 1727204071.48342: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12180 1727204071.48360: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12180 1727204071.48388: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12180 1727204071.48399: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12180 1727204071.48436: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12180 1727204071.48453: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12180 1727204071.48471: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12180 1727204071.48496: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12180 1727204071.48506: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12180 1727204071.48661: variable '__network_packages_default_gobject_packages' from source: role '' defaults
12180 1727204071.48741: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12180 1727204071.48760: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12180 1727204071.48779: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12180 1727204071.48804: Loading FilterModule
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12180 1727204071.48816: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12180 1727204071.48886: variable 'ansible_python' from source: facts 12180 1727204071.48907: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 12180 1727204071.48973: variable '__network_wpa_supplicant_required' from source: role '' defaults 12180 1727204071.49032: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 12180 1727204071.49119: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12180 1727204071.49137: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12180 1727204071.49154: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12180 1727204071.49184: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12180 1727204071.49194: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12180 1727204071.49230: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12180 1727204071.49248: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12180 1727204071.49266: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12180 1727204071.49294: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12180 1727204071.49306: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12180 1727204071.49403: variable 'network_connections' from source: task vars 12180 1727204071.49410: variable 'controller_profile' from source: play vars 12180 1727204071.49482: variable 'controller_profile' from source: play vars 12180 1727204071.49490: variable 'controller_device' from source: play vars 12180 1727204071.49566: variable 'controller_device' from source: play vars 12180 1727204071.49577: variable 'port1_profile' from source: play vars 12180 1727204071.49651: variable 'port1_profile' from source: play vars 12180 1727204071.49658: variable 'dhcp_interface1' from source: play vars 12180 1727204071.49733: variable 'dhcp_interface1' from source: play vars 12180 1727204071.49739: variable 'controller_profile' from source: play vars 12180 1727204071.49810: variable 'controller_profile' from source: play vars 12180 1727204071.49822: variable 'port2_profile' from source: play vars 12180 
1727204071.49892: variable 'port2_profile' from source: play vars 12180 1727204071.49900: variable 'dhcp_interface2' from source: play vars 12180 1727204071.49973: variable 'dhcp_interface2' from source: play vars 12180 1727204071.49981: variable 'controller_profile' from source: play vars 12180 1727204071.50056: variable 'controller_profile' from source: play vars 12180 1727204071.50107: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12180 1727204071.50126: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12180 1727204071.50152: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12180 1727204071.50176: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12180 1727204071.50213: variable '__network_wireless_connections_defined' from source: role '' defaults 12180 1727204071.50402: variable 'network_connections' from source: task vars 12180 1727204071.50405: variable 'controller_profile' from source: play vars 12180 1727204071.50481: variable 'controller_profile' from source: play vars 12180 1727204071.50493: variable 'controller_device' from source: play vars 12180 1727204071.50558: variable 'controller_device' from source: play vars 12180 1727204071.50569: variable 'port1_profile' from source: play vars 12180 1727204071.50641: variable 'port1_profile' from source: play vars 12180 1727204071.50649: variable 'dhcp_interface1' from source: play vars 12180 1727204071.50722: variable 'dhcp_interface1' from source: 
play vars 12180 1727204071.50732: variable 'controller_profile' from source: play vars 12180 1727204071.50804: variable 'controller_profile' from source: play vars 12180 1727204071.50808: variable 'port2_profile' from source: play vars 12180 1727204071.50882: variable 'port2_profile' from source: play vars 12180 1727204071.50889: variable 'dhcp_interface2' from source: play vars 12180 1727204071.50962: variable 'dhcp_interface2' from source: play vars 12180 1727204071.50971: variable 'controller_profile' from source: play vars 12180 1727204071.51042: variable 'controller_profile' from source: play vars 12180 1727204071.51084: variable '__network_packages_default_wireless' from source: role '' defaults 12180 1727204071.51146: variable '__network_wireless_connections_defined' from source: role '' defaults 12180 1727204071.51365: variable 'network_connections' from source: task vars 12180 1727204071.51370: variable 'controller_profile' from source: play vars 12180 1727204071.51414: variable 'controller_profile' from source: play vars 12180 1727204071.51420: variable 'controller_device' from source: play vars 12180 1727204071.51472: variable 'controller_device' from source: play vars 12180 1727204071.51476: variable 'port1_profile' from source: play vars 12180 1727204071.51522: variable 'port1_profile' from source: play vars 12180 1727204071.51530: variable 'dhcp_interface1' from source: play vars 12180 1727204071.51582: variable 'dhcp_interface1' from source: play vars 12180 1727204071.51585: variable 'controller_profile' from source: play vars 12180 1727204071.51631: variable 'controller_profile' from source: play vars 12180 1727204071.51635: variable 'port2_profile' from source: play vars 12180 1727204071.51685: variable 'port2_profile' from source: play vars 12180 1727204071.51695: variable 'dhcp_interface2' from source: play vars 12180 1727204071.51743: variable 'dhcp_interface2' from source: play vars 12180 1727204071.51748: variable 'controller_profile' from 
source: play vars 12180 1727204071.51802: variable 'controller_profile' from source: play vars 12180 1727204071.51820: variable '__network_packages_default_team' from source: role '' defaults 12180 1727204071.51877: variable '__network_team_connections_defined' from source: role '' defaults 12180 1727204071.52094: variable 'network_connections' from source: task vars 12180 1727204071.52098: variable 'controller_profile' from source: play vars 12180 1727204071.52149: variable 'controller_profile' from source: play vars 12180 1727204071.52155: variable 'controller_device' from source: play vars 12180 1727204071.52202: variable 'controller_device' from source: play vars 12180 1727204071.52209: variable 'port1_profile' from source: play vars 12180 1727204071.52258: variable 'port1_profile' from source: play vars 12180 1727204071.52266: variable 'dhcp_interface1' from source: play vars 12180 1727204071.52310: variable 'dhcp_interface1' from source: play vars 12180 1727204071.52315: variable 'controller_profile' from source: play vars 12180 1727204071.52366: variable 'controller_profile' from source: play vars 12180 1727204071.52372: variable 'port2_profile' from source: play vars 12180 1727204071.52417: variable 'port2_profile' from source: play vars 12180 1727204071.52423: variable 'dhcp_interface2' from source: play vars 12180 1727204071.52477: variable 'dhcp_interface2' from source: play vars 12180 1727204071.52483: variable 'controller_profile' from source: play vars 12180 1727204071.52530: variable 'controller_profile' from source: play vars 12180 1727204071.52582: variable '__network_service_name_default_initscripts' from source: role '' defaults 12180 1727204071.52623: variable '__network_service_name_default_initscripts' from source: role '' defaults 12180 1727204071.52631: variable '__network_packages_default_initscripts' from source: role '' defaults 12180 1727204071.52676: variable '__network_packages_default_initscripts' from source: role '' defaults 12180 
1727204071.52813: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 12180 1727204071.53120: variable 'network_connections' from source: task vars 12180 1727204071.53124: variable 'controller_profile' from source: play vars 12180 1727204071.53168: variable 'controller_profile' from source: play vars 12180 1727204071.53177: variable 'controller_device' from source: play vars 12180 1727204071.53219: variable 'controller_device' from source: play vars 12180 1727204071.53226: variable 'port1_profile' from source: play vars 12180 1727204071.53270: variable 'port1_profile' from source: play vars 12180 1727204071.53276: variable 'dhcp_interface1' from source: play vars 12180 1727204071.53318: variable 'dhcp_interface1' from source: play vars 12180 1727204071.53328: variable 'controller_profile' from source: play vars 12180 1727204071.53368: variable 'controller_profile' from source: play vars 12180 1727204071.53375: variable 'port2_profile' from source: play vars 12180 1727204071.53416: variable 'port2_profile' from source: play vars 12180 1727204071.53423: variable 'dhcp_interface2' from source: play vars 12180 1727204071.53468: variable 'dhcp_interface2' from source: play vars 12180 1727204071.53474: variable 'controller_profile' from source: play vars 12180 1727204071.53515: variable 'controller_profile' from source: play vars 12180 1727204071.53522: variable 'ansible_distribution' from source: facts 12180 1727204071.53525: variable '__network_rh_distros' from source: role '' defaults 12180 1727204071.53534: variable 'ansible_distribution_major_version' from source: facts 12180 1727204071.53555: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 12180 1727204071.53665: variable 'ansible_distribution' from source: facts 12180 1727204071.53671: variable '__network_rh_distros' from source: role '' defaults 12180 1727204071.53676: variable 'ansible_distribution_major_version' from source: 
facts
12180 1727204071.53687: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults
12180 1727204071.53797: variable 'ansible_distribution' from source: facts
12180 1727204071.53800: variable '__network_rh_distros' from source: role '' defaults
12180 1727204071.53804: variable 'ansible_distribution_major_version' from source: facts
12180 1727204071.53834: variable 'network_provider' from source: set_fact
12180 1727204071.53844: variable 'ansible_facts' from source: unknown
12180 1727204071.54310: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False
12180 1727204071.54315: when evaluation is False, skipping this task
12180 1727204071.54317: _execute() done
12180 1727204071.54321: dumping result to json
12180 1727204071.54323: done dumping result, returning
12180 1727204071.54325: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install packages [0affcd87-79f5-ccb1-55ae-00000000002e]
12180 1727204071.54330: sending task result for task 0affcd87-79f5-ccb1-55ae-00000000002e
12180 1727204071.54426: done sending task result for task 0affcd87-79f5-ccb1-55ae-00000000002e
12180 1727204071.54432: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not network_packages is subset(ansible_facts.packages.keys())",
    "skip_reason": "Conditional result was False"
}
12180 1727204071.54486: no more pending results, returning what we have
12180 1727204071.54489: results queue empty
12180 1727204071.54490: checking for any_errors_fatal
12180 1727204071.54496: done checking for any_errors_fatal
12180 1727204071.54496: checking for max_fail_percentage
12180 1727204071.54498: done checking for max_fail_percentage
12180 1727204071.54499: checking to see if all hosts have failed and the running result is not ok
12180 1727204071.54500: done checking to see if all hosts have failed
12180 1727204071.54500: getting the remaining hosts for
this loop 12180 1727204071.54502: done getting the remaining hosts for this loop 12180 1727204071.54505: getting the next task for host managed-node1 12180 1727204071.54515: done getting next task for host managed-node1 12180 1727204071.54526: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 12180 1727204071.54531: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12180 1727204071.54546: getting variables 12180 1727204071.54547: in VariableManager get_vars() 12180 1727204071.54587: Calling all_inventory to load vars for managed-node1 12180 1727204071.54590: Calling groups_inventory to load vars for managed-node1 12180 1727204071.54592: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204071.54601: Calling all_plugins_play to load vars for managed-node1 12180 1727204071.54603: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204071.54606: Calling groups_plugins_play to load vars for managed-node1 12180 1727204071.55445: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204071.56380: done with get_vars() 12180 1727204071.56398: done getting variables 12180 1727204071.56444: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] ***
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85
Tuesday 24 September 2024 14:54:31 -0400 (0:00:00.115) 0:00:18.976 *****
12180 1727204071.56474: entering _queue_task() for managed-node1/package
12180 1727204071.56698: worker is 1 (out of 1 available)
12180 1727204071.56711: exiting _queue_task() for managed-node1/package
12180 1727204071.56724: done queuing things up, now waiting for results queue to drain
12180 1727204071.56726: waiting for pending results...
12180 1727204071.56908: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
12180 1727204071.56992: in run() - task 0affcd87-79f5-ccb1-55ae-00000000002f
12180 1727204071.57003: variable 'ansible_search_path' from source: unknown
12180 1727204071.57007: variable 'ansible_search_path' from source: unknown
12180 1727204071.57040: calling self._execute()
12180 1727204071.57105: variable 'ansible_host' from source: host vars for 'managed-node1'
12180 1727204071.57108: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12180 1727204071.57117: variable 'omit' from source: magic vars
12180 1727204071.57390: variable 'ansible_distribution_major_version' from source: facts
12180 1727204071.57401: Evaluated conditional (ansible_distribution_major_version != '6'): True
12180 1727204071.57489: variable 'network_state' from source: role '' defaults
12180 1727204071.57498: Evaluated conditional (network_state != {}): False
12180 1727204071.57501: when evaluation is False, skipping this task
12180 1727204071.57503: _execute() done
12180
1727204071.57506: dumping result to json 12180 1727204071.57511: done dumping result, returning 12180 1727204071.57517: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcd87-79f5-ccb1-55ae-00000000002f] 12180 1727204071.57522: sending task result for task 0affcd87-79f5-ccb1-55ae-00000000002f 12180 1727204071.57613: done sending task result for task 0affcd87-79f5-ccb1-55ae-00000000002f 12180 1727204071.57616: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 12180 1727204071.57672: no more pending results, returning what we have 12180 1727204071.57675: results queue empty 12180 1727204071.57676: checking for any_errors_fatal 12180 1727204071.57681: done checking for any_errors_fatal 12180 1727204071.57682: checking for max_fail_percentage 12180 1727204071.57683: done checking for max_fail_percentage 12180 1727204071.57684: checking to see if all hosts have failed and the running result is not ok 12180 1727204071.57685: done checking to see if all hosts have failed 12180 1727204071.57686: getting the remaining hosts for this loop 12180 1727204071.57687: done getting the remaining hosts for this loop 12180 1727204071.57691: getting the next task for host managed-node1 12180 1727204071.57697: done getting next task for host managed-node1 12180 1727204071.57701: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 12180 1727204071.57704: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12180 1727204071.57718: getting variables
12180 1727204071.57719: in VariableManager get_vars()
12180 1727204071.57767: Calling all_inventory to load vars for managed-node1
12180 1727204071.57770: Calling groups_inventory to load vars for managed-node1
12180 1727204071.57772: Calling all_plugins_inventory to load vars for managed-node1
12180 1727204071.57780: Calling all_plugins_play to load vars for managed-node1
12180 1727204071.57783: Calling groups_plugins_inventory to load vars for managed-node1
12180 1727204071.57785: Calling groups_plugins_play to load vars for managed-node1
12180 1727204071.58749: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12180 1727204071.59687: done with get_vars()
12180 1727204071.59706: done getting variables
12180 1727204071.59752: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] ***
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96
Tuesday 24 September 2024 14:54:31 -0400 (0:00:00.033) 0:00:19.009 *****
12180 1727204071.59779: entering _queue_task() for managed-node1/package
12180 1727204071.60007: worker is 1 (out of 1 available)
12180 1727204071.60023: exiting _queue_task()
for managed-node1/package 12180 1727204071.60037: done queuing things up, now waiting for results queue to drain 12180 1727204071.60039: waiting for pending results... 12180 1727204071.60225: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 12180 1727204071.60315: in run() - task 0affcd87-79f5-ccb1-55ae-000000000030 12180 1727204071.60325: variable 'ansible_search_path' from source: unknown 12180 1727204071.60331: variable 'ansible_search_path' from source: unknown 12180 1727204071.60370: calling self._execute() 12180 1727204071.60629: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204071.60640: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204071.60652: variable 'omit' from source: magic vars 12180 1727204071.61012: variable 'ansible_distribution_major_version' from source: facts 12180 1727204071.61029: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204071.61153: variable 'network_state' from source: role '' defaults 12180 1727204071.61283: Evaluated conditional (network_state != {}): False 12180 1727204071.61292: when evaluation is False, skipping this task 12180 1727204071.61299: _execute() done 12180 1727204071.61306: dumping result to json 12180 1727204071.61313: done dumping result, returning 12180 1727204071.61324: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcd87-79f5-ccb1-55ae-000000000030] 12180 1727204071.61336: sending task result for task 0affcd87-79f5-ccb1-55ae-000000000030 12180 1727204071.61453: done sending task result for task 0affcd87-79f5-ccb1-55ae-000000000030 12180 1727204071.61460: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" 
} 12180 1727204071.61519: no more pending results, returning what we have 12180 1727204071.61525: results queue empty 12180 1727204071.61526: checking for any_errors_fatal 12180 1727204071.61533: done checking for any_errors_fatal 12180 1727204071.61533: checking for max_fail_percentage 12180 1727204071.61535: done checking for max_fail_percentage 12180 1727204071.61536: checking to see if all hosts have failed and the running result is not ok 12180 1727204071.61537: done checking to see if all hosts have failed 12180 1727204071.61537: getting the remaining hosts for this loop 12180 1727204071.61541: done getting the remaining hosts for this loop 12180 1727204071.61544: getting the next task for host managed-node1 12180 1727204071.61551: done getting next task for host managed-node1 12180 1727204071.61555: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 12180 1727204071.61558: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
12180 1727204071.61575: getting variables
12180 1727204071.61577: in VariableManager get_vars()
12180 1727204071.61619: Calling all_inventory to load vars for managed-node1
12180 1727204071.61623: Calling groups_inventory to load vars for managed-node1
12180 1727204071.61625: Calling all_plugins_inventory to load vars for managed-node1
12180 1727204071.61638: Calling all_plugins_play to load vars for managed-node1
12180 1727204071.61641: Calling groups_plugins_inventory to load vars for managed-node1
12180 1727204071.61643: Calling groups_plugins_play to load vars for managed-node1
12180 1727204071.64083: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12180 1727204071.71229: done with get_vars()
12180 1727204071.71260: done getting variables
12180 1727204071.71353: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)

TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109
Tuesday 24 September 2024 14:54:31 -0400 (0:00:00.116) 0:00:19.125 *****
12180 1727204071.71387: entering _queue_task() for managed-node1/service
12180 1727204071.71389: Creating lock for service
12180 1727204071.71721: worker is 1 (out of 1 available)
12180 1727204071.71734: exiting _queue_task() for managed-node1/service
12180 1727204071.71747: done queuing things up, now waiting for results queue to drain
12180 1727204071.71749: waiting for pending results...
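The "Restart NetworkManager due to wireless or team interfaces" task just queued is conditioned on flags such as `__network_wireless_connections_defined` and `__network_team_connections_defined`, which the role derives from the `network_connections` list. A hedged sketch of that kind of check — the helper name and the connection data below are hypothetical stand-ins, not the role's actual filter logic; the profiles mirror the controller/port bond setup visible in this log:

```python
def nm_restart_needed(network_connections):
    """True when any connection profile is of type wireless or team,
    the cases for which the role restarts NetworkManager."""
    return any(conn.get("type") in ("wireless", "team") for conn in network_connections)

# Hypothetical bond-with-two-ports profiles resembling controller_profile /
# port1_profile / port2_profile in this run
connections = [
    {"name": "bond0", "type": "bond", "interface_name": "nm-bond"},
    {"name": "bond0.0", "type": "ethernet", "controller": "bond0"},
    {"name": "bond0.1", "type": "ethernet", "controller": "bond0"},
]

print(nm_restart_needed(connections))  # False -> the restart task is skipped
```

With neither a wireless nor a team profile defined, the condition is False, consistent with the task being evaluated against a bond/ethernet topology here.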
12180 1727204071.72038: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 12180 1727204071.72192: in run() - task 0affcd87-79f5-ccb1-55ae-000000000031 12180 1727204071.72211: variable 'ansible_search_path' from source: unknown 12180 1727204071.72223: variable 'ansible_search_path' from source: unknown 12180 1727204071.72268: calling self._execute() 12180 1727204071.72369: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204071.72381: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204071.72395: variable 'omit' from source: magic vars 12180 1727204071.72789: variable 'ansible_distribution_major_version' from source: facts 12180 1727204071.72810: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204071.72926: variable '__network_wireless_connections_defined' from source: role '' defaults 12180 1727204071.73131: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12180 1727204071.75676: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12180 1727204071.75762: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12180 1727204071.75814: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12180 1727204071.75854: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12180 1727204071.75892: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12180 1727204071.76652: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 12180 1727204071.76742: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12180 1727204071.76846: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12180 1727204071.76895: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12180 1727204071.76942: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12180 1727204071.77082: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12180 1727204071.77111: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12180 1727204071.77255: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12180 1727204071.77304: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12180 1727204071.77324: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12180 1727204071.77490: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12180 1727204071.77517: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12180 1727204071.77548: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12180 1727204071.77595: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12180 1727204071.77698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12180 1727204071.78005: variable 'network_connections' from source: task vars 12180 1727204071.78139: variable 'controller_profile' from source: play vars 12180 1727204071.78217: variable 'controller_profile' from source: play vars 12180 1727204071.78351: variable 'controller_device' from source: play vars 12180 1727204071.78426: variable 'controller_device' from source: play vars 12180 1727204071.78450: variable 'port1_profile' from source: play vars 12180 1727204071.78518: variable 'port1_profile' from source: play vars 12180 1727204071.78677: variable 'dhcp_interface1' from source: play vars 12180 1727204071.78741: variable 'dhcp_interface1' from source: play vars 12180 1727204071.78752: variable 'controller_profile' from source: play vars 12180 
1727204071.78831: variable 'controller_profile' from source: play vars 12180 1727204071.79001: variable 'port2_profile' from source: play vars 12180 1727204071.79061: variable 'port2_profile' from source: play vars 12180 1727204071.79076: variable 'dhcp_interface2' from source: play vars 12180 1727204071.79144: variable 'dhcp_interface2' from source: play vars 12180 1727204071.79218: variable 'controller_profile' from source: play vars 12180 1727204071.79283: variable 'controller_profile' from source: play vars 12180 1727204071.79501: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12180 1727204071.79766: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12180 1727204071.79812: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12180 1727204071.79849: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12180 1727204071.79893: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12180 1727204071.79946: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12180 1727204071.79986: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12180 1727204071.80018: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12180 1727204071.80049: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, 
class_only=False) 12180 1727204071.80134: variable '__network_team_connections_defined' from source: role '' defaults 12180 1727204071.80413: variable 'network_connections' from source: task vars 12180 1727204071.80423: variable 'controller_profile' from source: play vars 12180 1727204071.80492: variable 'controller_profile' from source: play vars 12180 1727204071.80509: variable 'controller_device' from source: play vars 12180 1727204071.80572: variable 'controller_device' from source: play vars 12180 1727204071.80586: variable 'port1_profile' from source: play vars 12180 1727204071.80651: variable 'port1_profile' from source: play vars 12180 1727204071.80661: variable 'dhcp_interface1' from source: play vars 12180 1727204071.80719: variable 'dhcp_interface1' from source: play vars 12180 1727204071.80737: variable 'controller_profile' from source: play vars 12180 1727204071.80796: variable 'controller_profile' from source: play vars 12180 1727204071.80808: variable 'port2_profile' from source: play vars 12180 1727204071.80878: variable 'port2_profile' from source: play vars 12180 1727204071.80890: variable 'dhcp_interface2' from source: play vars 12180 1727204071.80980: variable 'dhcp_interface2' from source: play vars 12180 1727204071.80992: variable 'controller_profile' from source: play vars 12180 1727204071.81051: variable 'controller_profile' from source: play vars 12180 1727204071.81094: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 12180 1727204071.81103: when evaluation is False, skipping this task 12180 1727204071.81111: _execute() done 12180 1727204071.81117: dumping result to json 12180 1727204071.81124: done dumping result, returning 12180 1727204071.81137: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-ccb1-55ae-000000000031] 12180 1727204071.81147: sending task result for task 
0affcd87-79f5-ccb1-55ae-000000000031
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
12180 1727204071.81316: no more pending results, returning what we have
12180 1727204071.81321: results queue empty
12180 1727204071.81322: checking for any_errors_fatal
12180 1727204071.81330: done checking for any_errors_fatal
12180 1727204071.81331: checking for max_fail_percentage
12180 1727204071.81333: done checking for max_fail_percentage
12180 1727204071.81334: checking to see if all hosts have failed and the running result is not ok
12180 1727204071.81335: done checking to see if all hosts have failed
12180 1727204071.81336: getting the remaining hosts for this loop
12180 1727204071.81337: done getting the remaining hosts for this loop
12180 1727204071.81342: getting the next task for host managed-node1
12180 1727204071.81350: done getting next task for host managed-node1
12180 1727204071.81355: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
12180 1727204071.81359: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False
12180 1727204071.81376: getting variables
12180 1727204071.81378: in VariableManager get_vars()
12180 1727204071.81426: Calling all_inventory to load vars for managed-node1
12180 1727204071.81429: Calling groups_inventory to load vars for managed-node1
12180 1727204071.81432: Calling all_plugins_inventory to load vars for managed-node1
12180 1727204071.81444: Calling all_plugins_play to load vars for managed-node1
12180 1727204071.81446: Calling groups_plugins_inventory to load vars for managed-node1
12180 1727204071.81450: Calling groups_plugins_play to load vars for managed-node1
12180 1727204071.82407: done sending task result for task 0affcd87-79f5-ccb1-55ae-000000000031
12180 1727204071.82411: WORKER PROCESS EXITING
12180 1727204071.83336: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12180 1727204071.85097: done with get_vars()
12180 1727204071.85126: done getting variables
12180 1727204071.85195: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] *****
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Tuesday 24 September 2024  14:54:31 -0400 (0:00:00.138)       0:00:19.264 *****
12180 1727204071.85230: entering _queue_task() for managed-node1/service
12180 1727204071.85561: worker is 1 (out of 1 available)
12180 1727204071.85580: exiting _queue_task() for managed-node1/service
12180 1727204071.85593: done queuing things up, now waiting for results queue to drain
12180 1727204071.85595: waiting for pending results...
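The task queued here (tasks/main.yml:122) does run: the log below shows its conditional `network_provider == "nm" or network_state != {}` evaluating True, followed by the service action executing over SSH. A hedged sketch of what such a task plausibly looks like — the task name and `when` condition are from this log, and `network_service_name` is the variable the log resolves; the specific `state`/`enabled` arguments are assumptions:

```yaml
# Hypothetical sketch of tasks/main.yml:122 — name, condition, and the
# network_service_name variable appear in the log; states are assumed.
- name: Enable and start NetworkManager
  service:
    name: "{{ network_service_name }}"
    state: started
    enabled: true
  when: network_provider == "nm" or network_state != {}
```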
12180 1727204071.85882: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 12180 1727204071.86026: in run() - task 0affcd87-79f5-ccb1-55ae-000000000032 12180 1727204071.86050: variable 'ansible_search_path' from source: unknown 12180 1727204071.86058: variable 'ansible_search_path' from source: unknown 12180 1727204071.86099: calling self._execute() 12180 1727204071.86199: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204071.86212: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204071.86232: variable 'omit' from source: magic vars 12180 1727204071.86621: variable 'ansible_distribution_major_version' from source: facts 12180 1727204071.86682: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204071.86869: variable 'network_provider' from source: set_fact 12180 1727204071.86885: variable 'network_state' from source: role '' defaults 12180 1727204071.86899: Evaluated conditional (network_provider == "nm" or network_state != {}): True 12180 1727204071.86911: variable 'omit' from source: magic vars 12180 1727204071.86957: variable 'omit' from source: magic vars 12180 1727204071.86994: variable 'network_service_name' from source: role '' defaults 12180 1727204071.87061: variable 'network_service_name' from source: role '' defaults 12180 1727204071.87761: variable '__network_provider_setup' from source: role '' defaults 12180 1727204071.87775: variable '__network_service_name_default_nm' from source: role '' defaults 12180 1727204071.87847: variable '__network_service_name_default_nm' from source: role '' defaults 12180 1727204071.87866: variable '__network_packages_default_nm' from source: role '' defaults 12180 1727204071.87936: variable '__network_packages_default_nm' from source: role '' defaults 12180 1727204071.88178: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 12180 1727204071.90630: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12180 1727204071.90730: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12180 1727204071.90776: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12180 1727204071.90824: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12180 1727204071.90857: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12180 1727204071.90947: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12180 1727204071.90983: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12180 1727204071.91019: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12180 1727204071.91072: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12180 1727204071.91092: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12180 1727204071.91148: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 12180 1727204071.91179: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12180 1727204071.91209: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12180 1727204071.91261: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12180 1727204071.91284: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12180 1727204071.91547: variable '__network_packages_default_gobject_packages' from source: role '' defaults 12180 1727204071.91686: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12180 1727204071.91717: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12180 1727204071.91747: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12180 1727204071.91898: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12180 1727204071.91922: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12180 1727204071.92137: variable 'ansible_python' from source: facts 12180 1727204071.92168: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 12180 1727204071.92292: variable '__network_wpa_supplicant_required' from source: role '' defaults 12180 1727204071.92513: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 12180 1727204071.92803: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12180 1727204071.92890: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12180 1727204071.92920: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12180 1727204071.93015: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12180 1727204071.93088: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12180 1727204071.93144: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12180 1727204071.93233: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12180 1727204071.93379: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12180 1727204071.93483: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12180 1727204071.93511: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12180 1727204071.93692: variable 'network_connections' from source: task vars 12180 1727204071.93705: variable 'controller_profile' from source: play vars 12180 1727204071.93794: variable 'controller_profile' from source: play vars 12180 1727204071.93811: variable 'controller_device' from source: play vars 12180 1727204071.93914: variable 'controller_device' from source: play vars 12180 1727204071.93933: variable 'port1_profile' from source: play vars 12180 1727204071.94015: variable 'port1_profile' from source: play vars 12180 1727204071.94031: variable 'dhcp_interface1' from source: play vars 12180 1727204071.94114: variable 'dhcp_interface1' from source: play vars 12180 1727204071.94129: variable 'controller_profile' from source: play vars 12180 1727204071.94211: variable 'controller_profile' from source: play vars 12180 1727204071.94228: variable 'port2_profile' from source: play vars 12180 1727204071.94326: variable 'port2_profile' from source: play vars 12180 1727204071.94343: variable 'dhcp_interface2' from source: play vars 12180 1727204071.94428: variable 'dhcp_interface2' from source: play vars 12180 
1727204071.94445: variable 'controller_profile' from source: play vars 12180 1727204071.94531: variable 'controller_profile' from source: play vars 12180 1727204071.94651: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12180 1727204071.95409: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12180 1727204071.95467: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12180 1727204071.95520: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12180 1727204071.95567: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12180 1727204071.95641: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12180 1727204071.95690: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12180 1727204071.95780: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12180 1727204071.95819: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12180 1727204071.96013: variable '__network_wireless_connections_defined' from source: role '' defaults 12180 1727204071.96569: variable 'network_connections' from source: task vars 12180 1727204071.96686: variable 'controller_profile' from source: play vars 12180 1727204071.96885: variable 'controller_profile' from source: play vars 12180 
1727204071.96902: variable 'controller_device' from source: play vars 12180 1727204071.97012: variable 'controller_device' from source: play vars 12180 1727204071.97162: variable 'port1_profile' from source: play vars 12180 1727204071.97291: variable 'port1_profile' from source: play vars 12180 1727204071.97359: variable 'dhcp_interface1' from source: play vars 12180 1727204071.97448: variable 'dhcp_interface1' from source: play vars 12180 1727204071.97479: variable 'controller_profile' from source: play vars 12180 1727204071.97808: variable 'controller_profile' from source: play vars 12180 1727204071.97826: variable 'port2_profile' from source: play vars 12180 1727204071.97910: variable 'port2_profile' from source: play vars 12180 1727204071.97926: variable 'dhcp_interface2' from source: play vars 12180 1727204071.98011: variable 'dhcp_interface2' from source: play vars 12180 1727204071.98026: variable 'controller_profile' from source: play vars 12180 1727204071.98112: variable 'controller_profile' from source: play vars 12180 1727204071.98171: variable '__network_packages_default_wireless' from source: role '' defaults 12180 1727204071.98261: variable '__network_wireless_connections_defined' from source: role '' defaults 12180 1727204071.98570: variable 'network_connections' from source: task vars 12180 1727204071.98579: variable 'controller_profile' from source: play vars 12180 1727204071.98652: variable 'controller_profile' from source: play vars 12180 1727204071.98667: variable 'controller_device' from source: play vars 12180 1727204071.98744: variable 'controller_device' from source: play vars 12180 1727204071.98760: variable 'port1_profile' from source: play vars 12180 1727204071.98831: variable 'port1_profile' from source: play vars 12180 1727204071.98850: variable 'dhcp_interface1' from source: play vars 12180 1727204071.98922: variable 'dhcp_interface1' from source: play vars 12180 1727204071.98934: variable 'controller_profile' from source: play vars 
12180 1727204071.99014: variable 'controller_profile' from source: play vars 12180 1727204071.99027: variable 'port2_profile' from source: play vars 12180 1727204071.99103: variable 'port2_profile' from source: play vars 12180 1727204071.99115: variable 'dhcp_interface2' from source: play vars 12180 1727204071.99208: variable 'dhcp_interface2' from source: play vars 12180 1727204071.99220: variable 'controller_profile' from source: play vars 12180 1727204071.99409: variable 'controller_profile' from source: play vars 12180 1727204071.99441: variable '__network_packages_default_team' from source: role '' defaults 12180 1727204071.99531: variable '__network_team_connections_defined' from source: role '' defaults 12180 1727204072.00006: variable 'network_connections' from source: task vars 12180 1727204072.00020: variable 'controller_profile' from source: play vars 12180 1727204072.00099: variable 'controller_profile' from source: play vars 12180 1727204072.00111: variable 'controller_device' from source: play vars 12180 1727204072.00192: variable 'controller_device' from source: play vars 12180 1727204072.00206: variable 'port1_profile' from source: play vars 12180 1727204072.00323: variable 'port1_profile' from source: play vars 12180 1727204072.00335: variable 'dhcp_interface1' from source: play vars 12180 1727204072.00415: variable 'dhcp_interface1' from source: play vars 12180 1727204072.00472: variable 'controller_profile' from source: play vars 12180 1727204072.00587: variable 'controller_profile' from source: play vars 12180 1727204072.00601: variable 'port2_profile' from source: play vars 12180 1727204072.00675: variable 'port2_profile' from source: play vars 12180 1727204072.00694: variable 'dhcp_interface2' from source: play vars 12180 1727204072.00818: variable 'dhcp_interface2' from source: play vars 12180 1727204072.00829: variable 'controller_profile' from source: play vars 12180 1727204072.00902: variable 'controller_profile' from source: play vars 
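The repeated lookups above come from templating `network_connections`, which references play vars such as `controller_profile`, `controller_device`, `port1_profile`, and `dhcp_interface1` — a bond/team-style controller with two DHCP ports. A sketch of how those play vars might be laid out; the variable names are taken from this log, but every value is invented purely for illustration:

```yaml
# Illustrative play vars only — the names come from the log above;
# the values are hypothetical.
controller_profile: bond0
controller_device: nm-bond
port1_profile: bond0.0
dhcp_interface1: test1
port2_profile: bond0.1
dhcp_interface2: test2
```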
12180 1727204072.00981: variable '__network_service_name_default_initscripts' from source: role '' defaults 12180 1727204072.01054: variable '__network_service_name_default_initscripts' from source: role '' defaults 12180 1727204072.01069: variable '__network_packages_default_initscripts' from source: role '' defaults 12180 1727204072.01136: variable '__network_packages_default_initscripts' from source: role '' defaults 12180 1727204072.01378: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 12180 1727204072.02067: variable 'network_connections' from source: task vars 12180 1727204072.02079: variable 'controller_profile' from source: play vars 12180 1727204072.02149: variable 'controller_profile' from source: play vars 12180 1727204072.02234: variable 'controller_device' from source: play vars 12180 1727204072.02300: variable 'controller_device' from source: play vars 12180 1727204072.02453: variable 'port1_profile' from source: play vars 12180 1727204072.02519: variable 'port1_profile' from source: play vars 12180 1727204072.02532: variable 'dhcp_interface1' from source: play vars 12180 1727204072.02715: variable 'dhcp_interface1' from source: play vars 12180 1727204072.02778: variable 'controller_profile' from source: play vars 12180 1727204072.02935: variable 'controller_profile' from source: play vars 12180 1727204072.02996: variable 'port2_profile' from source: play vars 12180 1727204072.03236: variable 'port2_profile' from source: play vars 12180 1727204072.03320: variable 'dhcp_interface2' from source: play vars 12180 1727204072.03391: variable 'dhcp_interface2' from source: play vars 12180 1727204072.03616: variable 'controller_profile' from source: play vars 12180 1727204072.03691: variable 'controller_profile' from source: play vars 12180 1727204072.03819: variable 'ansible_distribution' from source: facts 12180 1727204072.03874: variable '__network_rh_distros' from source: role '' defaults 12180 1727204072.03968: 
variable 'ansible_distribution_major_version' from source: facts 12180 1727204072.04006: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 12180 1727204072.04444: variable 'ansible_distribution' from source: facts 12180 1727204072.04453: variable '__network_rh_distros' from source: role '' defaults 12180 1727204072.04463: variable 'ansible_distribution_major_version' from source: facts 12180 1727204072.04529: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 12180 1727204072.04987: variable 'ansible_distribution' from source: facts 12180 1727204072.04997: variable '__network_rh_distros' from source: role '' defaults 12180 1727204072.05006: variable 'ansible_distribution_major_version' from source: facts 12180 1727204072.05088: variable 'network_provider' from source: set_fact 12180 1727204072.05195: variable 'omit' from source: magic vars 12180 1727204072.05232: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12180 1727204072.05296: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12180 1727204072.05371: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12180 1727204072.05434: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204072.07350: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204072.07461: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12180 1727204072.07475: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204072.07484: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204072.07845: Set connection var ansible_pipelining to False 12180 
1727204072.07861: Set connection var ansible_shell_type to sh 12180 1727204072.07876: Set connection var ansible_timeout to 10 12180 1727204072.08051: Set connection var ansible_connection to ssh 12180 1727204072.08061: Set connection var ansible_shell_executable to /bin/sh 12180 1727204072.08077: Set connection var ansible_module_compression to ZIP_DEFLATED 12180 1727204072.08112: variable 'ansible_shell_executable' from source: unknown 12180 1727204072.08120: variable 'ansible_connection' from source: unknown 12180 1727204072.08126: variable 'ansible_module_compression' from source: unknown 12180 1727204072.08135: variable 'ansible_shell_type' from source: unknown 12180 1727204072.08190: variable 'ansible_shell_executable' from source: unknown 12180 1727204072.08199: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204072.08208: variable 'ansible_pipelining' from source: unknown 12180 1727204072.08216: variable 'ansible_timeout' from source: unknown 12180 1727204072.08224: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204072.08486: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12180 1727204072.08584: variable 'omit' from source: magic vars 12180 1727204072.08595: starting attempt loop 12180 1727204072.08601: running the handler 12180 1727204072.08799: variable 'ansible_facts' from source: unknown 12180 1727204072.09577: _low_level_execute_command(): starting 12180 1727204072.09589: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12180 1727204072.10429: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204072.10447: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 12180 1727204072.10463: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204072.10487: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204072.10537: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204072.10549: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204072.10567: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204072.10587: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204072.10602: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204072.10618: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204072.10631: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204072.10646: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204072.10662: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204072.10678: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204072.10689: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204072.10703: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204072.10787: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204072.10813: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204072.10838: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204072.10935: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session 
id: 2 <<< 12180 1727204072.12606: stdout chunk (state=3): >>>/root <<< 12180 1727204072.12709: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204072.12763: stderr chunk (state=3): >>><<< 12180 1727204072.12773: stdout chunk (state=3): >>><<< 12180 1727204072.12788: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204072.12799: _low_level_execute_command(): starting 12180 1727204072.12806: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204072.1278815-14322-167800273494013 `" && echo ansible-tmp-1727204072.1278815-14322-167800273494013="` echo /root/.ansible/tmp/ansible-tmp-1727204072.1278815-14322-167800273494013 `" ) && sleep 0' 12180 1727204072.13282: stderr chunk (state=2): 
>>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204072.13286: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204072.13334: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204072.13338: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204072.13340: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204072.13401: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204072.13405: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204072.13474: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204072.15345: stdout chunk (state=3): >>>ansible-tmp-1727204072.1278815-14322-167800273494013=/root/.ansible/tmp/ansible-tmp-1727204072.1278815-14322-167800273494013 <<< 12180 1727204072.15485: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204072.15546: stderr chunk (state=3): >>><<< 12180 1727204072.15549: stdout chunk (state=3): >>><<< 12180 1727204072.15775: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204072.1278815-14322-167800273494013=/root/.ansible/tmp/ansible-tmp-1727204072.1278815-14322-167800273494013 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204072.15779: variable 'ansible_module_compression' from source: unknown 12180 1727204072.15782: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 12180 1727204072.15785: ANSIBALLZ: Acquiring lock 12180 1727204072.15787: ANSIBALLZ: Lock acquired: 140650305861680 12180 1727204072.15789: ANSIBALLZ: Creating module 12180 1727204072.71525: ANSIBALLZ: Writing module into payload 12180 1727204072.71699: ANSIBALLZ: Writing module 12180 1727204072.71735: ANSIBALLZ: Renaming module 12180 1727204072.71739: ANSIBALLZ: Done creating module 12180 1727204072.71780: variable 'ansible_facts' from source: unknown 12180 1727204072.71998: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1727204072.1278815-14322-167800273494013/AnsiballZ_systemd.py 12180 1727204072.72142: Sending initial data 12180 1727204072.72146: Sent initial data (156 bytes) 12180 1727204072.73207: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204072.73216: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204072.73225: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204072.73241: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204072.73289: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204072.73293: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204072.73295: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204072.73315: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204072.73318: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204072.73320: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204072.73325: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204072.73337: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204072.73349: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204072.73355: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204072.73362: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204072.73377: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 
1727204072.73450: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204072.73467: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204072.73470: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204072.73751: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204072.75538: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12180 1727204072.75585: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12180 1727204072.75646: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12180cbnqllfr/tmpxt6i8ld1 /root/.ansible/tmp/ansible-tmp-1727204072.1278815-14322-167800273494013/AnsiballZ_systemd.py <<< 12180 1727204072.75699: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12180 1727204072.78117: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204072.78322: stderr chunk (state=3): >>><<< 12180 1727204072.78326: stdout chunk (state=3): >>><<< 12180 1727204072.78328: done transferring module to remote 12180 1727204072.78330: _low_level_execute_command(): starting 12180 1727204072.78333: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1727204072.1278815-14322-167800273494013/ /root/.ansible/tmp/ansible-tmp-1727204072.1278815-14322-167800273494013/AnsiballZ_systemd.py && sleep 0' 12180 1727204072.79198: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204072.79202: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204072.79239: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 12180 1727204072.79242: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204072.79244: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found <<< 12180 1727204072.79246: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204072.79319: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204072.79323: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204072.79327: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204072.79400: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204072.81129: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204072.81225: stderr chunk (state=3): >>><<< 12180 1727204072.81229: 
stdout chunk (state=3): >>><<< 12180 1727204072.81329: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204072.81333: _low_level_execute_command(): starting 12180 1727204072.81335: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204072.1278815-14322-167800273494013/AnsiballZ_systemd.py && sleep 0' 12180 1727204072.83053: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204072.83057: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204072.83161: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 
12180 1727204072.83168: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204072.83170: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204072.83230: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204072.83302: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204072.83390: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204073.08288: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "619", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 
2024-09-24 14:49:25 EDT", "ExecMainStartTimestampMonotonic": "28837083", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "619", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2418", "MemoryCurrent": "13516800", "MemoryAvailable": "infinity", "CPUUsageNSec": "417373000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", 
"MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "<<< 12180 1727204073.08326: stdout chunk (state=3): >>>0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", 
"SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service NetworkManager-wait-online.service network.target network.service shutdown.target multi-user.target", "After": "dbus.socket systemd-journald.socket sysinit.target network-pre.target basic.target system.slice cloud-init-local.service dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", 
"AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:54:21 EDT", "StateChangeTimestampMonotonic": "324827295", "InactiveExitTimestamp": "Tue 2024-09-24 14:49:25 EDT", "InactiveExitTimestampMonotonic": "28837278", "ActiveEnterTimestamp": "Tue 2024-09-24 14:49:27 EDT", "ActiveEnterTimestampMonotonic": "30313565", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:49:25 EDT", "ConditionTimestampMonotonic": "28833288", "AssertTimestamp": "Tue 2024-09-24 14:49:25 EDT", "AssertTimestampMonotonic": "28833291", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "a065c0d4382c4b51bfc5a74ffa3d403d", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 12180 1727204073.09681: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
<<< 12180 1727204073.09747: stderr chunk (state=3): >>><<< 12180 1727204073.09749: stdout chunk (state=3): >>><<< 12180 1727204073.09771: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "619", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:49:25 EDT", "ExecMainStartTimestampMonotonic": "28837083", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "619", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call 
org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2418", "MemoryCurrent": "13516800", "MemoryAvailable": "infinity", "CPUUsageNSec": "417373000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", 
"LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", 
"MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service NetworkManager-wait-online.service network.target network.service shutdown.target multi-user.target", "After": "dbus.socket systemd-journald.socket sysinit.target network-pre.target basic.target system.slice cloud-init-local.service dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:54:21 EDT", "StateChangeTimestampMonotonic": "324827295", "InactiveExitTimestamp": "Tue 2024-09-24 14:49:25 EDT", "InactiveExitTimestampMonotonic": "28837278", "ActiveEnterTimestamp": "Tue 2024-09-24 14:49:27 EDT", "ActiveEnterTimestampMonotonic": "30313565", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", 
"OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:49:25 EDT", "ConditionTimestampMonotonic": "28833288", "AssertTimestamp": "Tue 2024-09-24 14:49:25 EDT", "AssertTimestampMonotonic": "28833291", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "a065c0d4382c4b51bfc5a74ffa3d403d", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 12180 1727204073.09890: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204072.1278815-14322-167800273494013/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12180 1727204073.09919: _low_level_execute_command(): starting 12180 1727204073.09923: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204072.1278815-14322-167800273494013/ > /dev/null 2>&1 && sleep 0' 12180 1727204073.10665: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204073.10670: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204073.10673: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204073.10675: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204073.10873: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204073.10876: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204073.10879: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204073.10881: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204073.10883: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204073.10884: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204073.10886: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204073.10888: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204073.10890: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204073.10892: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204073.10893: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204073.10895: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204073.10897: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204073.10899: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204073.10901: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204073.10995: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204073.12750: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204073.12822: stderr chunk (state=3): >>><<< 12180 1727204073.12841: stdout chunk (state=3): >>><<< 12180 1727204073.13176: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204073.13180: handler run complete 12180 1727204073.13182: attempt loop complete, returning result 12180 1727204073.13184: _execute() done 12180 1727204073.13185: dumping result to json 12180 1727204073.13187: done dumping result, returning 12180 1727204073.13189: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcd87-79f5-ccb1-55ae-000000000032] 12180 1727204073.13191: sending task result for task 0affcd87-79f5-ccb1-55ae-000000000032 12180 1727204073.13323: done sending task result for task 0affcd87-79f5-ccb1-55ae-000000000032 12180 1727204073.13326: WORKER PROCESS EXITING ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12180 1727204073.13377: no more pending results, returning what we have 12180 1727204073.13380: results queue empty 12180 1727204073.13381: checking for any_errors_fatal 12180 1727204073.13385: done checking for any_errors_fatal 12180 1727204073.13386: checking for max_fail_percentage 12180 1727204073.13388: done checking for max_fail_percentage 12180 1727204073.13388: checking to see 
if all hosts have failed and the running result is not ok 12180 1727204073.13389: done checking to see if all hosts have failed 12180 1727204073.13390: getting the remaining hosts for this loop 12180 1727204073.13391: done getting the remaining hosts for this loop 12180 1727204073.13394: getting the next task for host managed-node1 12180 1727204073.13401: done getting next task for host managed-node1 12180 1727204073.13404: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 12180 1727204073.13407: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12180 1727204073.13415: getting variables 12180 1727204073.13417: in VariableManager get_vars() 12180 1727204073.13455: Calling all_inventory to load vars for managed-node1 12180 1727204073.13458: Calling groups_inventory to load vars for managed-node1 12180 1727204073.13460: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204073.13472: Calling all_plugins_play to load vars for managed-node1 12180 1727204073.13474: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204073.13477: Calling groups_plugins_play to load vars for managed-node1 12180 1727204073.15169: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204073.16101: done with get_vars() 12180 1727204073.16118: done getting variables 12180 1727204073.16167: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 14:54:33 -0400 (0:00:01.309) 0:00:20.573 ***** 12180 1727204073.16197: entering _queue_task() for managed-node1/service 12180 1727204073.16427: worker is 1 (out of 1 available) 12180 1727204073.16443: exiting _queue_task() for managed-node1/service 12180 1727204073.16455: done queuing things up, now waiting for results queue to drain 12180 1727204073.16457: waiting for pending results... 
12180 1727204073.16624: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 12180 1727204073.16713: in run() - task 0affcd87-79f5-ccb1-55ae-000000000033 12180 1727204073.16726: variable 'ansible_search_path' from source: unknown 12180 1727204073.16732: variable 'ansible_search_path' from source: unknown 12180 1727204073.16760: calling self._execute() 12180 1727204073.16836: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204073.16842: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204073.16851: variable 'omit' from source: magic vars 12180 1727204073.17119: variable 'ansible_distribution_major_version' from source: facts 12180 1727204073.17132: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204073.17212: variable 'network_provider' from source: set_fact 12180 1727204073.17215: Evaluated conditional (network_provider == "nm"): True 12180 1727204073.17284: variable '__network_wpa_supplicant_required' from source: role '' defaults 12180 1727204073.17347: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 12180 1727204073.17466: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12180 1727204073.18987: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12180 1727204073.19035: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12180 1727204073.19061: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12180 1727204073.19088: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12180 1727204073.19112: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12180 1727204073.19179: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12180 1727204073.19200: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12180 1727204073.19221: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12180 1727204073.19250: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12180 1727204073.19260: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12180 1727204073.19301: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12180 1727204073.19317: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12180 1727204073.19340: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12180 1727204073.19364: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12180 1727204073.19375: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12180 1727204073.19404: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12180 1727204073.19420: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12180 1727204073.19438: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12180 1727204073.19466: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12180 1727204073.19477: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12180 1727204073.19577: variable 'network_connections' from source: task vars 12180 1727204073.19586: variable 'controller_profile' from source: play vars 12180 1727204073.19636: variable 'controller_profile' from source: play vars 12180 1727204073.19643: variable 'controller_device' from source: play vars 12180 1727204073.19691: variable 'controller_device' from source: play vars 12180 1727204073.19699: variable 'port1_profile' from source: play vars 12180 1727204073.19742: variable 'port1_profile' from source: play vars 12180 
1727204073.19748: variable 'dhcp_interface1' from source: play vars 12180 1727204073.19794: variable 'dhcp_interface1' from source: play vars 12180 1727204073.19799: variable 'controller_profile' from source: play vars 12180 1727204073.19842: variable 'controller_profile' from source: play vars 12180 1727204073.19848: variable 'port2_profile' from source: play vars 12180 1727204073.19892: variable 'port2_profile' from source: play vars 12180 1727204073.19898: variable 'dhcp_interface2' from source: play vars 12180 1727204073.19941: variable 'dhcp_interface2' from source: play vars 12180 1727204073.19946: variable 'controller_profile' from source: play vars 12180 1727204073.19991: variable 'controller_profile' from source: play vars 12180 1727204073.20042: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12180 1727204073.20155: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12180 1727204073.20184: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12180 1727204073.20208: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12180 1727204073.20231: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12180 1727204073.20260: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12180 1727204073.20278: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12180 1727204073.20296: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 
(found_in_cache=True, class_only=False) 12180 1727204073.20320: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12180 1727204073.20359: variable '__network_wireless_connections_defined' from source: role '' defaults 12180 1727204073.20533: variable 'network_connections' from source: task vars 12180 1727204073.20537: variable 'controller_profile' from source: play vars 12180 1727204073.20578: variable 'controller_profile' from source: play vars 12180 1727204073.20583: variable 'controller_device' from source: play vars 12180 1727204073.20632: variable 'controller_device' from source: play vars 12180 1727204073.20635: variable 'port1_profile' from source: play vars 12180 1727204073.20678: variable 'port1_profile' from source: play vars 12180 1727204073.20685: variable 'dhcp_interface1' from source: play vars 12180 1727204073.20726: variable 'dhcp_interface1' from source: play vars 12180 1727204073.20736: variable 'controller_profile' from source: play vars 12180 1727204073.20778: variable 'controller_profile' from source: play vars 12180 1727204073.20784: variable 'port2_profile' from source: play vars 12180 1727204073.20826: variable 'port2_profile' from source: play vars 12180 1727204073.20835: variable 'dhcp_interface2' from source: play vars 12180 1727204073.20879: variable 'dhcp_interface2' from source: play vars 12180 1727204073.20885: variable 'controller_profile' from source: play vars 12180 1727204073.20927: variable 'controller_profile' from source: play vars 12180 1727204073.20960: Evaluated conditional (__network_wpa_supplicant_required): False 12180 1727204073.20964: when evaluation is False, skipping this task 12180 1727204073.20971: _execute() done 12180 1727204073.20974: dumping result to json 12180 1727204073.20976: done dumping result, returning 12180 1727204073.20981: done running TaskExecutor() for 
managed-node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcd87-79f5-ccb1-55ae-000000000033] 12180 1727204073.20987: sending task result for task 0affcd87-79f5-ccb1-55ae-000000000033 12180 1727204073.21090: done sending task result for task 0affcd87-79f5-ccb1-55ae-000000000033 12180 1727204073.21093: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 12180 1727204073.21137: no more pending results, returning what we have 12180 1727204073.21140: results queue empty 12180 1727204073.21141: checking for any_errors_fatal 12180 1727204073.21159: done checking for any_errors_fatal 12180 1727204073.21160: checking for max_fail_percentage 12180 1727204073.21161: done checking for max_fail_percentage 12180 1727204073.21162: checking to see if all hosts have failed and the running result is not ok 12180 1727204073.21163: done checking to see if all hosts have failed 12180 1727204073.21165: getting the remaining hosts for this loop 12180 1727204073.21166: done getting the remaining hosts for this loop 12180 1727204073.21170: getting the next task for host managed-node1 12180 1727204073.21180: done getting next task for host managed-node1 12180 1727204073.21185: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 12180 1727204073.21191: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 12180 1727204073.21205: getting variables 12180 1727204073.21207: in VariableManager get_vars() 12180 1727204073.21248: Calling all_inventory to load vars for managed-node1 12180 1727204073.21251: Calling groups_inventory to load vars for managed-node1 12180 1727204073.21253: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204073.21261: Calling all_plugins_play to load vars for managed-node1 12180 1727204073.21265: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204073.21268: Calling groups_plugins_play to load vars for managed-node1 12180 1727204073.22074: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204073.23014: done with get_vars() 12180 1727204073.23035: done getting variables 12180 1727204073.23081: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 14:54:33 -0400 (0:00:00.069) 0:00:20.642 ***** 12180 1727204073.23104: entering _queue_task() for managed-node1/service 12180 1727204073.23340: worker is 1 (out of 1 available) 12180 1727204073.23352: exiting _queue_task() for managed-node1/service 12180 1727204073.23367: done queuing things up, now waiting for results queue to drain 12180 1727204073.23369: waiting for pending results... 
12180 1727204073.23542: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable network service 12180 1727204073.23622: in run() - task 0affcd87-79f5-ccb1-55ae-000000000034 12180 1727204073.23634: variable 'ansible_search_path' from source: unknown 12180 1727204073.23638: variable 'ansible_search_path' from source: unknown 12180 1727204073.23671: calling self._execute() 12180 1727204073.23736: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204073.23740: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204073.23748: variable 'omit' from source: magic vars 12180 1727204073.24023: variable 'ansible_distribution_major_version' from source: facts 12180 1727204073.24033: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204073.24112: variable 'network_provider' from source: set_fact 12180 1727204073.24116: Evaluated conditional (network_provider == "initscripts"): False 12180 1727204073.24118: when evaluation is False, skipping this task 12180 1727204073.24121: _execute() done 12180 1727204073.24124: dumping result to json 12180 1727204073.24126: done dumping result, returning 12180 1727204073.24136: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable network service [0affcd87-79f5-ccb1-55ae-000000000034] 12180 1727204073.24139: sending task result for task 0affcd87-79f5-ccb1-55ae-000000000034 12180 1727204073.24224: done sending task result for task 0affcd87-79f5-ccb1-55ae-000000000034 12180 1727204073.24226: WORKER PROCESS EXITING skipping: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12180 1727204073.24293: no more pending results, returning what we have 12180 1727204073.24296: results queue empty 12180 1727204073.24297: checking for any_errors_fatal 12180 1727204073.24303: done checking for 
any_errors_fatal 12180 1727204073.24304: checking for max_fail_percentage 12180 1727204073.24305: done checking for max_fail_percentage 12180 1727204073.24306: checking to see if all hosts have failed and the running result is not ok 12180 1727204073.24307: done checking to see if all hosts have failed 12180 1727204073.24308: getting the remaining hosts for this loop 12180 1727204073.24309: done getting the remaining hosts for this loop 12180 1727204073.24312: getting the next task for host managed-node1 12180 1727204073.24318: done getting next task for host managed-node1 12180 1727204073.24322: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 12180 1727204073.24325: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12180 1727204073.24344: getting variables 12180 1727204073.24346: in VariableManager get_vars() 12180 1727204073.24386: Calling all_inventory to load vars for managed-node1 12180 1727204073.24388: Calling groups_inventory to load vars for managed-node1 12180 1727204073.24390: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204073.24399: Calling all_plugins_play to load vars for managed-node1 12180 1727204073.24400: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204073.24403: Calling groups_plugins_play to load vars for managed-node1 12180 1727204073.25273: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204073.26196: done with get_vars() 12180 1727204073.26211: done getting variables 12180 1727204073.26256: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 14:54:33 -0400 (0:00:00.031) 0:00:20.674 ***** 12180 1727204073.26282: entering _queue_task() for managed-node1/copy 12180 1727204073.26600: worker is 1 (out of 1 available) 12180 1727204073.26611: exiting _queue_task() for managed-node1/copy 12180 1727204073.26622: done queuing things up, now waiting for results queue to drain 12180 1727204073.26624: waiting for pending results... 
12180 1727204073.26749: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 12180 1727204073.26879: in run() - task 0affcd87-79f5-ccb1-55ae-000000000035 12180 1727204073.26891: variable 'ansible_search_path' from source: unknown 12180 1727204073.26894: variable 'ansible_search_path' from source: unknown 12180 1727204073.26932: calling self._execute() 12180 1727204073.27030: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204073.27035: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204073.27043: variable 'omit' from source: magic vars 12180 1727204073.27441: variable 'ansible_distribution_major_version' from source: facts 12180 1727204073.27454: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204073.27581: variable 'network_provider' from source: set_fact 12180 1727204073.27592: Evaluated conditional (network_provider == "initscripts"): False 12180 1727204073.27596: when evaluation is False, skipping this task 12180 1727204073.27599: _execute() done 12180 1727204073.27607: dumping result to json 12180 1727204073.27611: done dumping result, returning 12180 1727204073.27620: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcd87-79f5-ccb1-55ae-000000000035] 12180 1727204073.27625: sending task result for task 0affcd87-79f5-ccb1-55ae-000000000035 12180 1727204073.27718: done sending task result for task 0affcd87-79f5-ccb1-55ae-000000000035 12180 1727204073.27721: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 12180 1727204073.27776: no more pending results, returning what we have 12180 1727204073.27779: results queue empty 12180 1727204073.27780: checking for 
any_errors_fatal 12180 1727204073.27785: done checking for any_errors_fatal 12180 1727204073.27785: checking for max_fail_percentage 12180 1727204073.27787: done checking for max_fail_percentage 12180 1727204073.27788: checking to see if all hosts have failed and the running result is not ok 12180 1727204073.27789: done checking to see if all hosts have failed 12180 1727204073.27789: getting the remaining hosts for this loop 12180 1727204073.27790: done getting the remaining hosts for this loop 12180 1727204073.27794: getting the next task for host managed-node1 12180 1727204073.27800: done getting next task for host managed-node1 12180 1727204073.27804: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 12180 1727204073.27807: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12180 1727204073.27821: getting variables 12180 1727204073.27823: in VariableManager get_vars() 12180 1727204073.27861: Calling all_inventory to load vars for managed-node1 12180 1727204073.27867: Calling groups_inventory to load vars for managed-node1 12180 1727204073.27870: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204073.27878: Calling all_plugins_play to load vars for managed-node1 12180 1727204073.27881: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204073.27883: Calling groups_plugins_play to load vars for managed-node1 12180 1727204073.29325: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204073.30745: done with get_vars() 12180 1727204073.30765: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 14:54:33 -0400 (0:00:00.045) 0:00:20.720 ***** 12180 1727204073.30828: entering _queue_task() for managed-node1/fedora.linux_system_roles.network_connections 12180 1727204073.30830: Creating lock for fedora.linux_system_roles.network_connections 12180 1727204073.31062: worker is 1 (out of 1 available) 12180 1727204073.31076: exiting _queue_task() for managed-node1/fedora.linux_system_roles.network_connections 12180 1727204073.31088: done queuing things up, now waiting for results queue to drain 12180 1727204073.31090: waiting for pending results... 
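The records above show the per-host `when` evaluation that produced the `skipping: [managed-node1]` result: `ansible_distribution_major_version != '6'` rendered True, then `network_provider == "initscripts"` rendered False, so the task was skipped and the failing expression was reported as `false_condition`. A minimal sketch of that skip logic (hypothetical, not Ansible's actual implementation; the variable values are taken from the log):

```python
# Hypothetical sketch of per-task conditional evaluation: each `when`
# expression is rendered against the host's variables, and the first one
# that evaluates False short-circuits the task into a "skipping" result.
def evaluate_conditionals(conditionals, hostvars):
    """Return (passed, failed_condition), mirroring the log's 'false_condition'."""
    for cond in conditionals:
        # Real Ansible templates the expression with Jinja2; this fakes it
        # with a tiny evaluator over the host variables (illustrative only).
        if not eval(cond, {}, dict(hostvars)):
            return False, cond
    return True, None

hostvars = {
    "ansible_distribution_major_version": "9",  # assumed; any value != '6'
    "network_provider": "nm",                   # from set_fact in the log
}
passed, failed = evaluate_conditionals(
    ["ansible_distribution_major_version != '6'",
     'network_provider == "initscripts"'],
    hostvars,
)
result = (
    {"changed": False, "false_condition": failed,
     "skip_reason": "Conditional result was False"}
    if not passed
    else {"changed": True}
)
```

With the values above, the first conditional passes and the second fails, reproducing the skipped-task result JSON seen in the log.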
12180 1727204073.31262: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 12180 1727204073.31351: in run() - task 0affcd87-79f5-ccb1-55ae-000000000036 12180 1727204073.31361: variable 'ansible_search_path' from source: unknown 12180 1727204073.31366: variable 'ansible_search_path' from source: unknown 12180 1727204073.31394: calling self._execute() 12180 1727204073.31469: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204073.31473: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204073.31481: variable 'omit' from source: magic vars 12180 1727204073.31755: variable 'ansible_distribution_major_version' from source: facts 12180 1727204073.31766: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204073.31772: variable 'omit' from source: magic vars 12180 1727204073.31810: variable 'omit' from source: magic vars 12180 1727204073.31923: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12180 1727204073.33469: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12180 1727204073.33516: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12180 1727204073.33544: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12180 1727204073.33572: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12180 1727204073.33595: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12180 1727204073.33654: variable 'network_provider' from source: set_fact 12180 1727204073.33749: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12180 1727204073.33781: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12180 1727204073.33804: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12180 1727204073.33829: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12180 1727204073.33842: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12180 1727204073.33895: variable 'omit' from source: magic vars 12180 1727204073.33977: variable 'omit' from source: magic vars 12180 1727204073.34050: variable 'network_connections' from source: task vars 12180 1727204073.34060: variable 'controller_profile' from source: play vars 12180 1727204073.34104: variable 'controller_profile' from source: play vars 12180 1727204073.34110: variable 'controller_device' from source: play vars 12180 1727204073.34158: variable 'controller_device' from source: play vars 12180 1727204073.34168: variable 'port1_profile' from source: play vars 12180 1727204073.34209: variable 'port1_profile' from source: play vars 12180 1727204073.34215: variable 'dhcp_interface1' from source: play vars 12180 1727204073.34262: variable 'dhcp_interface1' from source: play vars 12180 1727204073.34269: variable 'controller_profile' from source: play vars 12180 1727204073.34311: variable 'controller_profile' from source: play vars 12180 1727204073.34317: 
variable 'port2_profile' from source: play vars 12180 1727204073.34365: variable 'port2_profile' from source: play vars 12180 1727204073.34371: variable 'dhcp_interface2' from source: play vars 12180 1727204073.34414: variable 'dhcp_interface2' from source: play vars 12180 1727204073.34420: variable 'controller_profile' from source: play vars 12180 1727204073.34466: variable 'controller_profile' from source: play vars 12180 1727204073.34593: variable 'omit' from source: magic vars 12180 1727204073.34600: variable '__lsr_ansible_managed' from source: task vars 12180 1727204073.34645: variable '__lsr_ansible_managed' from source: task vars 12180 1727204073.34772: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 12180 1727204073.34918: Loaded config def from plugin (lookup/template) 12180 1727204073.34921: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 12180 1727204073.34945: File lookup term: get_ansible_managed.j2 12180 1727204073.34948: variable 'ansible_search_path' from source: unknown 12180 1727204073.34951: evaluation_path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 12180 1727204073.34965: search_path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 12180 1727204073.34977: variable 'ansible_search_path' from source: unknown 12180 1727204073.38468: variable 'ansible_managed' from source: unknown 12180 1727204073.38556: variable 'omit' from source: magic vars 12180 1727204073.38582: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12180 1727204073.38601: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12180 1727204073.38615: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12180 1727204073.38628: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204073.38638: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204073.38659: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12180 1727204073.38662: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204073.38670: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204073.38731: Set connection var ansible_pipelining to False 12180 1727204073.38736: Set connection var ansible_shell_type to sh 12180 1727204073.38742: Set connection var ansible_timeout to 10 12180 1727204073.38747: Set connection var ansible_connection to ssh 12180 1727204073.38752: Set connection var ansible_shell_executable to /bin/sh 12180 1727204073.38763: Set connection var ansible_module_compression to ZIP_DEFLATED 12180 1727204073.38781: 
variable 'ansible_shell_executable' from source: unknown 12180 1727204073.38784: variable 'ansible_connection' from source: unknown 12180 1727204073.38791: variable 'ansible_module_compression' from source: unknown 12180 1727204073.38793: variable 'ansible_shell_type' from source: unknown 12180 1727204073.38795: variable 'ansible_shell_executable' from source: unknown 12180 1727204073.38798: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204073.38801: variable 'ansible_pipelining' from source: unknown 12180 1727204073.38803: variable 'ansible_timeout' from source: unknown 12180 1727204073.38805: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204073.38901: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 12180 1727204073.38906: variable 'omit' from source: magic vars 12180 1727204073.38914: starting attempt loop 12180 1727204073.38917: running the handler 12180 1727204073.38927: _low_level_execute_command(): starting 12180 1727204073.38935: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12180 1727204073.39452: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204073.39463: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204073.39494: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204073.39507: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204073.39573: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204073.39581: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204073.39585: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204073.39653: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204073.41290: stdout chunk (state=3): >>>/root <<< 12180 1727204073.41395: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204073.41458: stderr chunk (state=3): >>><<< 12180 1727204073.41461: stdout chunk (state=3): >>><<< 12180 1727204073.41487: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 
originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204073.41497: _low_level_execute_command(): starting 12180 1727204073.41503: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204073.414874-14513-250206404286361 `" && echo ansible-tmp-1727204073.414874-14513-250206404286361="` echo /root/.ansible/tmp/ansible-tmp-1727204073.414874-14513-250206404286361 `" ) && sleep 0' 12180 1727204073.41972: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204073.41978: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204073.42012: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204073.42025: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204073.42036: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204073.42086: 
stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204073.42098: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204073.42160: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204073.44008: stdout chunk (state=3): >>>ansible-tmp-1727204073.414874-14513-250206404286361=/root/.ansible/tmp/ansible-tmp-1727204073.414874-14513-250206404286361 <<< 12180 1727204073.44121: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204073.44180: stderr chunk (state=3): >>><<< 12180 1727204073.44183: stdout chunk (state=3): >>><<< 12180 1727204073.44199: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204073.414874-14513-250206404286361=/root/.ansible/tmp/ansible-tmp-1727204073.414874-14513-250206404286361 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 
2 debug2: Received exit status from master 0 12180 1727204073.44244: variable 'ansible_module_compression' from source: unknown 12180 1727204073.44288: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 12180 1727204073.44291: ANSIBALLZ: Acquiring lock 12180 1727204073.44294: ANSIBALLZ: Lock acquired: 140650302647968 12180 1727204073.44296: ANSIBALLZ: Creating module 12180 1727204073.58697: ANSIBALLZ: Writing module into payload 12180 1727204073.59198: ANSIBALLZ: Writing module 12180 1727204073.59203: ANSIBALLZ: Renaming module 12180 1727204073.59209: ANSIBALLZ: Done creating module 12180 1727204073.59238: variable 'ansible_facts' from source: unknown 12180 1727204073.59342: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204073.414874-14513-250206404286361/AnsiballZ_network_connections.py 12180 1727204073.59490: Sending initial data 12180 1727204073.59493: Sent initial data (167 bytes) 12180 1727204073.60518: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204073.60522: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204073.60524: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204073.60527: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204073.60608: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204073.60612: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204073.60615: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204073.60617: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204073.60619: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204073.60622: stderr chunk (state=3): 
>>>debug1: re-parsing configuration <<< 12180 1727204073.60624: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204073.60635: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204073.60647: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204073.60655: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204073.60661: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204073.60674: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204073.60747: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204073.60763: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204073.60775: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204073.60857: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204073.62628: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12180 1727204073.62685: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12180 1727204073.62736: stdout chunk (state=3): >>>sftp> put 
/root/.ansible/tmp/ansible-local-12180cbnqllfr/tmp4m5nlj33 /root/.ansible/tmp/ansible-tmp-1727204073.414874-14513-250206404286361/AnsiballZ_network_connections.py <<< 12180 1727204073.62788: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12180 1727204073.64537: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204073.64621: stderr chunk (state=3): >>><<< 12180 1727204073.64625: stdout chunk (state=3): >>><<< 12180 1727204073.64651: done transferring module to remote 12180 1727204073.64662: _low_level_execute_command(): starting 12180 1727204073.64673: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204073.414874-14513-250206404286361/ /root/.ansible/tmp/ansible-tmp-1727204073.414874-14513-250206404286361/AnsiballZ_network_connections.py && sleep 0' 12180 1727204073.65342: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204073.65348: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204073.65387: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 12180 1727204073.65393: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration <<< 12180 1727204073.65408: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204073.65413: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 
originally 10.31.9.148 debug2: match found <<< 12180 1727204073.65419: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204073.65496: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204073.65519: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204073.65540: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204073.65625: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204073.67427: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204073.67432: stdout chunk (state=3): >>><<< 12180 1727204073.67434: stderr chunk (state=3): >>><<< 12180 1727204073.67533: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from 
master 0 12180 1727204073.67537: _low_level_execute_command(): starting 12180 1727204073.67539: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204073.414874-14513-250206404286361/AnsiballZ_network_connections.py && sleep 0' 12180 1727204073.68074: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204073.68081: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204073.68127: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204073.68130: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204073.68132: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204073.68179: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204073.68191: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204073.68268: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204074.15219: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 
7454a8c0-a94c-487e-947e-0611b087626b\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 6724accd-c5ba-48f8-ba5f-9a64052000e9\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 24de85ff-9c9d-4d15-a711-da166680223b\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 7454a8c0-a94c-487e-947e-0611b087626b (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 6724accd-c5ba-48f8-ba5f-9a64052000e9 (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 24de85ff-9c9d-4d15-a711-da166680223b (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "deprecated-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "master": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "master": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "deprecated-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "master": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "master": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 12180 1727204074.16946: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
<<< 12180 1727204074.17044: stderr chunk (state=3): >>><<< 12180 1727204074.17048: stdout chunk (state=3): >>><<< 12180 1727204074.17223: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 7454a8c0-a94c-487e-947e-0611b087626b\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 6724accd-c5ba-48f8-ba5f-9a64052000e9\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 24de85ff-9c9d-4d15-a711-da166680223b\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 7454a8c0-a94c-487e-947e-0611b087626b (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 6724accd-c5ba-48f8-ba5f-9a64052000e9 (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 24de85ff-9c9d-4d15-a711-da166680223b (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "deprecated-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "master": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "master": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "deprecated-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "master": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "master": "bond0"}], 
"__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
12180 1727204074.17227: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0', 'state': 'up', 'type': 'bond', 'interface_name': 'deprecated-bond', 'bond': {'mode': 'active-backup', 'miimon': 110}, 'ip': {'route_metric4': 65535}}, {'name': 'bond0.0', 'state': 'up', 'type': 'ethernet', 'interface_name': 'test1', 'master': 'bond0'}, {'name': 'bond0.1', 'state': 'up', 'type': 'ethernet', 'interface_name': 'test2', 'master': 'bond0'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204073.414874-14513-250206404286361/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12180 1727204074.17235: _low_level_execute_command(): starting 12180 1727204074.17237: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204073.414874-14513-250206404286361/ > /dev/null 2>&1 && sleep 0' 12180 1727204074.17848: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204074.17869: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204074.17889: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204074.17907: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204074.17949: 
stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204074.17962: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204074.17985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204074.18004: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204074.18018: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204074.18031: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204074.18043: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204074.18058: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204074.18078: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204074.18097: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204074.18110: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204074.18124: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204074.18202: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204074.18230: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204074.18247: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204074.18338: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204074.20263: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204074.20330: stderr chunk (state=3): >>><<< 12180 1727204074.20334: stdout chunk (state=3): >>><<< 12180 1727204074.20474: _low_level_execute_command() done: rc=0, 
stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204074.20478: handler run complete 12180 1727204074.20480: attempt loop complete, returning result 12180 1727204074.20482: _execute() done 12180 1727204074.20484: dumping result to json 12180 1727204074.20485: done dumping result, returning 12180 1727204074.20487: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcd87-79f5-ccb1-55ae-000000000036] 12180 1727204074.20489: sending task result for task 0affcd87-79f5-ccb1-55ae-000000000036 12180 1727204074.20666: done sending task result for task 0affcd87-79f5-ccb1-55ae-000000000036 12180 1727204074.20670: WORKER PROCESS EXITING changed: [managed-node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "bond": 
{ "miimon": 110, "mode": "active-backup" }, "interface_name": "deprecated-bond", "ip": { "route_metric4": 65535 }, "name": "bond0", "state": "up", "type": "bond" }, { "interface_name": "test1", "master": "bond0", "name": "bond0.0", "state": "up", "type": "ethernet" }, { "interface_name": "test2", "master": "bond0", "name": "bond0.1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 7454a8c0-a94c-487e-947e-0611b087626b [008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 6724accd-c5ba-48f8-ba5f-9a64052000e9 [009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 24de85ff-9c9d-4d15-a711-da166680223b [010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 7454a8c0-a94c-487e-947e-0611b087626b (is-modified) [011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 6724accd-c5ba-48f8-ba5f-9a64052000e9 (not-active) [012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 24de85ff-9c9d-4d15-a711-da166680223b (not-active) 12180 1727204074.20828: no more pending results, returning what we have 12180 1727204074.20831: results queue empty 12180 1727204074.20832: checking for any_errors_fatal 12180 1727204074.20838: done checking for any_errors_fatal 12180 1727204074.20839: checking for max_fail_percentage 12180 1727204074.20840: done checking for max_fail_percentage 12180 1727204074.20841: checking to see if all hosts have failed and the running result is not ok 12180 1727204074.20842: done checking to see if all hosts have failed 12180 1727204074.20843: getting the remaining hosts for this loop 12180 1727204074.20844: done getting the remaining hosts for this loop 12180 1727204074.20849: getting the next task for host managed-node1 12180 1727204074.20855: done getting next task for host managed-node1 
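The module invocation recorded above corresponds to role input along the following lines. This is a sketch reconstructed from the logged `module_args` — the playbook that produced this run is not part of this trace, so the task name and `include_role` wrapper are illustrative; the `network_connections` values themselves are taken verbatim from the log:

```yaml
# Hypothetical invocation reconstructed from the logged module_args;
# connection values (names, bond options, route metric) match the trace above.
- name: Configure bond0 with two ethernet ports
  ansible.builtin.include_role:
    name: fedora.linux_system_roles.network
  vars:
    network_connections:
      - name: bond0
        state: up
        type: bond
        interface_name: deprecated-bond
        bond:
          mode: active-backup
          miimon: 110
        ip:
          route_metric4: 65535
      - name: bond0.0
        state: up
        type: ethernet
        interface_name: test1
        master: bond0
      - name: bond0.1
        state: up
        type: ethernet
        interface_name: test2
        master: bond0
```

The `(is-modified)` and `(not-active)` markers in the module's stderr indicate that the bond profile was updated and brought up, while the two port profiles were newly added and activated.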
12180 1727204074.20859: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 12180 1727204074.20862: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12180 1727204074.20880: getting variables 12180 1727204074.20882: in VariableManager get_vars() 12180 1727204074.20925: Calling all_inventory to load vars for managed-node1 12180 1727204074.20928: Calling groups_inventory to load vars for managed-node1 12180 1727204074.20931: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204074.20942: Calling all_plugins_play to load vars for managed-node1 12180 1727204074.20945: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204074.20948: Calling groups_plugins_play to load vars for managed-node1 12180 1727204074.22724: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204074.25337: done with get_vars() 12180 1727204074.25369: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 14:54:34 -0400 (0:00:00.946) 0:00:21.666 ***** 12180 1727204074.25463: entering _queue_task() for managed-node1/fedora.linux_system_roles.network_state 12180 1727204074.25467: Creating lock for fedora.linux_system_roles.network_state 12180 1727204074.25804: 
worker is 1 (out of 1 available) 12180 1727204074.25818: exiting _queue_task() for managed-node1/fedora.linux_system_roles.network_state 12180 1727204074.25830: done queuing things up, now waiting for results queue to drain 12180 1727204074.25832: waiting for pending results... 12180 1727204074.26120: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking state 12180 1727204074.26256: in run() - task 0affcd87-79f5-ccb1-55ae-000000000037 12180 1727204074.26311: variable 'ansible_search_path' from source: unknown 12180 1727204074.26319: variable 'ansible_search_path' from source: unknown 12180 1727204074.26423: calling self._execute() 12180 1727204074.26602: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204074.26615: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204074.26632: variable 'omit' from source: magic vars 12180 1727204074.27016: variable 'ansible_distribution_major_version' from source: facts 12180 1727204074.27038: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204074.27171: variable 'network_state' from source: role '' defaults 12180 1727204074.27187: Evaluated conditional (network_state != {}): False 12180 1727204074.27194: when evaluation is False, skipping this task 12180 1727204074.27201: _execute() done 12180 1727204074.27210: dumping result to json 12180 1727204074.27217: done dumping result, returning 12180 1727204074.27228: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking state [0affcd87-79f5-ccb1-55ae-000000000037] 12180 1727204074.27239: sending task result for task 0affcd87-79f5-ccb1-55ae-000000000037 12180 1727204074.27349: done sending task result for task 0affcd87-79f5-ccb1-55ae-000000000037 skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 
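The skip above follows from the role default: `network_state` defaults to an empty dict, so the conditional `network_state != {}` evaluates to False unless the caller sets it. An illustrative (not the role's actual source at `roles/network/tasks/main.yml:171`) sketch of a task guarded the same way:

```yaml
# Illustrative only: a task with this conditional is skipped while
# network_state keeps its empty default ({}).
- name: Configure networking state (illustrative)
  ansible.builtin.debug:
    msg: "network_state is non-empty, state-based configuration would run"
  when: network_state != {}
```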
12180 1727204074.27406: no more pending results, returning what we have 12180 1727204074.27410: results queue empty 12180 1727204074.27412: checking for any_errors_fatal 12180 1727204074.27423: done checking for any_errors_fatal 12180 1727204074.27424: checking for max_fail_percentage 12180 1727204074.27426: done checking for max_fail_percentage 12180 1727204074.27427: checking to see if all hosts have failed and the running result is not ok 12180 1727204074.27428: done checking to see if all hosts have failed 12180 1727204074.27429: getting the remaining hosts for this loop 12180 1727204074.27430: done getting the remaining hosts for this loop 12180 1727204074.27434: getting the next task for host managed-node1 12180 1727204074.27442: done getting next task for host managed-node1 12180 1727204074.27446: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 12180 1727204074.27449: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12180 1727204074.27467: getting variables 12180 1727204074.27469: in VariableManager get_vars() 12180 1727204074.27516: Calling all_inventory to load vars for managed-node1 12180 1727204074.27519: Calling groups_inventory to load vars for managed-node1 12180 1727204074.27522: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204074.27534: Calling all_plugins_play to load vars for managed-node1 12180 1727204074.27537: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204074.27540: Calling groups_plugins_play to load vars for managed-node1 12180 1727204074.28481: WORKER PROCESS EXITING 12180 1727204074.29534: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204074.32242: done with get_vars() 12180 1727204074.32279: done getting variables 12180 1727204074.32368: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 14:54:34 -0400 (0:00:00.069) 0:00:21.736 ***** 12180 1727204074.32413: entering _queue_task() for managed-node1/debug 12180 1727204074.32891: worker is 1 (out of 1 available) 12180 1727204074.32903: exiting _queue_task() for managed-node1/debug 12180 1727204074.32915: done queuing things up, now waiting for results queue to drain 12180 1727204074.32916: waiting for pending results... 
12180 1727204074.33260: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 12180 1727204074.33491: in run() - task 0affcd87-79f5-ccb1-55ae-000000000038 12180 1727204074.33604: variable 'ansible_search_path' from source: unknown 12180 1727204074.33610: variable 'ansible_search_path' from source: unknown 12180 1727204074.33663: calling self._execute() 12180 1727204074.33766: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204074.33772: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204074.33783: variable 'omit' from source: magic vars 12180 1727204074.34349: variable 'ansible_distribution_major_version' from source: facts 12180 1727204074.34353: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204074.34356: variable 'omit' from source: magic vars 12180 1727204074.34359: variable 'omit' from source: magic vars 12180 1727204074.34409: variable 'omit' from source: magic vars 12180 1727204074.34454: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12180 1727204074.34496: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12180 1727204074.34521: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12180 1727204074.34543: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204074.34554: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204074.34638: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12180 1727204074.34642: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204074.34644: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed-node1' 12180 1727204074.35247: Set connection var ansible_pipelining to False 12180 1727204074.35250: Set connection var ansible_shell_type to sh 12180 1727204074.35256: Set connection var ansible_timeout to 10 12180 1727204074.35261: Set connection var ansible_connection to ssh 12180 1727204074.35271: Set connection var ansible_shell_executable to /bin/sh 12180 1727204074.35274: Set connection var ansible_module_compression to ZIP_DEFLATED 12180 1727204074.35313: variable 'ansible_shell_executable' from source: unknown 12180 1727204074.35316: variable 'ansible_connection' from source: unknown 12180 1727204074.35319: variable 'ansible_module_compression' from source: unknown 12180 1727204074.35321: variable 'ansible_shell_type' from source: unknown 12180 1727204074.35324: variable 'ansible_shell_executable' from source: unknown 12180 1727204074.35326: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204074.35328: variable 'ansible_pipelining' from source: unknown 12180 1727204074.35334: variable 'ansible_timeout' from source: unknown 12180 1727204074.35338: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204074.35579: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12180 1727204074.35583: variable 'omit' from source: magic vars 12180 1727204074.35585: starting attempt loop 12180 1727204074.35588: running the handler 12180 1727204074.35741: variable '__network_connections_result' from source: set_fact 12180 1727204074.35807: handler run complete 12180 1727204074.35824: attempt loop complete, returning result 12180 1727204074.35827: _execute() done 12180 1727204074.35830: dumping result to json 12180 1727204074.35845: 
done dumping result, returning 12180 1727204074.35853: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcd87-79f5-ccb1-55ae-000000000038] 12180 1727204074.35859: sending task result for task 0affcd87-79f5-ccb1-55ae-000000000038 ok: [managed-node1] => { "__network_connections_result.stderr_lines": [ "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 7454a8c0-a94c-487e-947e-0611b087626b", "[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 6724accd-c5ba-48f8-ba5f-9a64052000e9", "[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 24de85ff-9c9d-4d15-a711-da166680223b", "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 7454a8c0-a94c-487e-947e-0611b087626b (is-modified)", "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 6724accd-c5ba-48f8-ba5f-9a64052000e9 (not-active)", "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 24de85ff-9c9d-4d15-a711-da166680223b (not-active)" ] } 12180 1727204074.36024: done sending task result for task 0affcd87-79f5-ccb1-55ae-000000000038 12180 1727204074.36030: WORKER PROCESS EXITING 12180 1727204074.36039: no more pending results, returning what we have 12180 1727204074.36043: results queue empty 12180 1727204074.36044: checking for any_errors_fatal 12180 1727204074.36051: done checking for any_errors_fatal 12180 1727204074.36052: checking for max_fail_percentage 12180 1727204074.36054: done checking for max_fail_percentage 12180 1727204074.36055: checking to see if all hosts have failed and the running result is not ok 12180 1727204074.36056: done checking to see if all hosts have failed 12180 1727204074.36057: getting the remaining hosts for this loop 12180 1727204074.36059: done getting the remaining hosts for this loop 12180 1727204074.36063: getting the next task for host 
managed-node1 12180 1727204074.36072: done getting next task for host managed-node1 12180 1727204074.36076: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 12180 1727204074.36080: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12180 1727204074.36093: getting variables 12180 1727204074.36095: in VariableManager get_vars() 12180 1727204074.36145: Calling all_inventory to load vars for managed-node1 12180 1727204074.36149: Calling groups_inventory to load vars for managed-node1 12180 1727204074.36152: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204074.36166: Calling all_plugins_play to load vars for managed-node1 12180 1727204074.36169: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204074.36172: Calling groups_plugins_play to load vars for managed-node1 12180 1727204074.38042: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204074.40061: done with get_vars() 12180 1727204074.40091: done getting variables 12180 1727204074.40152: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show 
debug messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 14:54:34 -0400 (0:00:00.077) 0:00:21.813 ***** 12180 1727204074.40200: entering _queue_task() for managed-node1/debug 12180 1727204074.40923: worker is 1 (out of 1 available) 12180 1727204074.40937: exiting _queue_task() for managed-node1/debug 12180 1727204074.40948: done queuing things up, now waiting for results queue to drain 12180 1727204074.40950: waiting for pending results... 12180 1727204074.41359: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 12180 1727204074.41521: in run() - task 0affcd87-79f5-ccb1-55ae-000000000039 12180 1727204074.41547: variable 'ansible_search_path' from source: unknown 12180 1727204074.41556: variable 'ansible_search_path' from source: unknown 12180 1727204074.41603: calling self._execute() 12180 1727204074.41710: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204074.41721: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204074.41738: variable 'omit' from source: magic vars 12180 1727204074.42231: variable 'ansible_distribution_major_version' from source: facts 12180 1727204074.42251: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204074.42255: variable 'omit' from source: magic vars 12180 1727204074.42304: variable 'omit' from source: magic vars 12180 1727204074.42332: variable 'omit' from source: magic vars 12180 1727204074.42371: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12180 1727204074.42400: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12180 1727204074.42417: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 
12180 1727204074.42432: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204074.42440: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204074.42464: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12180 1727204074.42468: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204074.42472: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204074.42543: Set connection var ansible_pipelining to False 12180 1727204074.42547: Set connection var ansible_shell_type to sh 12180 1727204074.42551: Set connection var ansible_timeout to 10 12180 1727204074.42556: Set connection var ansible_connection to ssh 12180 1727204074.42562: Set connection var ansible_shell_executable to /bin/sh 12180 1727204074.42568: Set connection var ansible_module_compression to ZIP_DEFLATED 12180 1727204074.42590: variable 'ansible_shell_executable' from source: unknown 12180 1727204074.42593: variable 'ansible_connection' from source: unknown 12180 1727204074.42595: variable 'ansible_module_compression' from source: unknown 12180 1727204074.42597: variable 'ansible_shell_type' from source: unknown 12180 1727204074.42600: variable 'ansible_shell_executable' from source: unknown 12180 1727204074.42602: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204074.42604: variable 'ansible_pipelining' from source: unknown 12180 1727204074.42607: variable 'ansible_timeout' from source: unknown 12180 1727204074.42616: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204074.42715: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12180 1727204074.42725: variable 'omit' from source: magic vars 12180 1727204074.42732: starting attempt loop 12180 1727204074.42736: running the handler 12180 1727204074.42771: variable '__network_connections_result' from source: set_fact 12180 1727204074.42836: variable '__network_connections_result' from source: set_fact 12180 1727204074.42945: handler run complete 12180 1727204074.42966: attempt loop complete, returning result 12180 1727204074.42969: _execute() done 12180 1727204074.42972: dumping result to json 12180 1727204074.42977: done dumping result, returning 12180 1727204074.42984: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcd87-79f5-ccb1-55ae-000000000039] 12180 1727204074.42989: sending task result for task 0affcd87-79f5-ccb1-55ae-000000000039 ok: [managed-node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "bond": { "miimon": 110, "mode": "active-backup" }, "interface_name": "deprecated-bond", "ip": { "route_metric4": 65535 }, "name": "bond0", "state": "up", "type": "bond" }, { "interface_name": "test1", "master": "bond0", "name": "bond0.0", "state": "up", "type": "ethernet" }, { "interface_name": "test2", "master": "bond0", "name": "bond0.1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 7454a8c0-a94c-487e-947e-0611b087626b\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 6724accd-c5ba-48f8-ba5f-9a64052000e9\n[009] #2, state:up 
persistent_state:present, 'bond0.1': add connection bond0.1, 24de85ff-9c9d-4d15-a711-da166680223b\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 7454a8c0-a94c-487e-947e-0611b087626b (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 6724accd-c5ba-48f8-ba5f-9a64052000e9 (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 24de85ff-9c9d-4d15-a711-da166680223b (not-active)\n", "stderr_lines": [ "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 7454a8c0-a94c-487e-947e-0611b087626b", "[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 6724accd-c5ba-48f8-ba5f-9a64052000e9", "[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 24de85ff-9c9d-4d15-a711-da166680223b", "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 7454a8c0-a94c-487e-947e-0611b087626b (is-modified)", "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 6724accd-c5ba-48f8-ba5f-9a64052000e9 (not-active)", "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 24de85ff-9c9d-4d15-a711-da166680223b (not-active)" ] } } 12180 1727204074.43193: no more pending results, returning what we have 12180 1727204074.43196: results queue empty 12180 1727204074.43197: checking for any_errors_fatal 12180 1727204074.43201: done checking for any_errors_fatal 12180 1727204074.43202: checking for max_fail_percentage 12180 1727204074.43209: done checking for max_fail_percentage 12180 1727204074.43210: checking to see if all hosts have failed and the running result is not ok 12180 1727204074.43211: done checking to see if all hosts have failed 12180 1727204074.43212: getting the remaining hosts for this loop 12180 1727204074.43213: done getting the remaining hosts for this loop 12180 1727204074.43216: getting the next task for host managed-node1 12180 
1727204074.43221: done getting next task for host managed-node1 12180 1727204074.43225: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 12180 1727204074.43227: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12180 1727204074.43239: getting variables 12180 1727204074.43241: in VariableManager get_vars() 12180 1727204074.43292: Calling all_inventory to load vars for managed-node1 12180 1727204074.43295: Calling groups_inventory to load vars for managed-node1 12180 1727204074.43298: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204074.43306: Calling all_plugins_play to load vars for managed-node1 12180 1727204074.43309: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204074.43311: Calling groups_plugins_play to load vars for managed-node1 12180 1727204074.43893: done sending task result for task 0affcd87-79f5-ccb1-55ae-000000000039 12180 1727204074.43897: WORKER PROCESS EXITING 12180 1727204074.45660: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204074.47300: done with get_vars() 12180 1727204074.47333: done getting variables 12180 1727204074.47404: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 14:54:34 -0400 (0:00:00.072) 0:00:21.886 ***** 12180 1727204074.47439: entering _queue_task() for managed-node1/debug 12180 1727204074.47763: worker is 1 (out of 1 available) 12180 1727204074.47775: exiting _queue_task() for managed-node1/debug 12180 1727204074.47787: done queuing things up, now waiting for results queue to drain 12180 1727204074.47788: waiting for pending results... 12180 1727204074.48084: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 12180 1727204074.48375: in run() - task 0affcd87-79f5-ccb1-55ae-00000000003a 12180 1727204074.48379: variable 'ansible_search_path' from source: unknown 12180 1727204074.48382: variable 'ansible_search_path' from source: unknown 12180 1727204074.48385: calling self._execute() 12180 1727204074.48387: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204074.48391: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204074.48393: variable 'omit' from source: magic vars 12180 1727204074.48741: variable 'ansible_distribution_major_version' from source: facts 12180 1727204074.48752: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204074.48877: variable 'network_state' from source: role '' defaults 12180 1727204074.48893: Evaluated conditional (network_state != {}): False 12180 1727204074.48896: when evaluation is False, skipping this task 12180 1727204074.48899: _execute() done 12180 1727204074.48901: dumping result to json 12180 1727204074.48904: done 
dumping result, returning 12180 1727204074.48911: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcd87-79f5-ccb1-55ae-00000000003a] 12180 1727204074.48917: sending task result for task 0affcd87-79f5-ccb1-55ae-00000000003a skipping: [managed-node1] => { "false_condition": "network_state != {}" } 12180 1727204074.49049: no more pending results, returning what we have 12180 1727204074.49053: results queue empty 12180 1727204074.49055: checking for any_errors_fatal 12180 1727204074.49067: done checking for any_errors_fatal 12180 1727204074.49068: checking for max_fail_percentage 12180 1727204074.49070: done checking for max_fail_percentage 12180 1727204074.49071: checking to see if all hosts have failed and the running result is not ok 12180 1727204074.49071: done checking to see if all hosts have failed 12180 1727204074.49072: getting the remaining hosts for this loop 12180 1727204074.49073: done getting the remaining hosts for this loop 12180 1727204074.49077: getting the next task for host managed-node1 12180 1727204074.49084: done getting next task for host managed-node1 12180 1727204074.49088: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 12180 1727204074.49091: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12180 1727204074.49109: done sending task result for task 0affcd87-79f5-ccb1-55ae-00000000003a 12180 1727204074.49115: WORKER PROCESS EXITING 12180 1727204074.49122: getting variables 12180 1727204074.49124: in VariableManager get_vars() 12180 1727204074.49173: Calling all_inventory to load vars for managed-node1 12180 1727204074.49177: Calling groups_inventory to load vars for managed-node1 12180 1727204074.49179: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204074.49193: Calling all_plugins_play to load vars for managed-node1 12180 1727204074.49196: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204074.49199: Calling groups_plugins_play to load vars for managed-node1 12180 1727204074.50816: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204074.52571: done with get_vars() 12180 1727204074.52596: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 14:54:34 -0400 (0:00:00.052) 0:00:21.938 ***** 12180 1727204074.52695: entering _queue_task() for managed-node1/ping 12180 1727204074.52697: Creating lock for ping 12180 1727204074.53022: worker is 1 (out of 1 available) 12180 1727204074.53034: exiting _queue_task() for managed-node1/ping 12180 1727204074.53046: done queuing things up, now waiting for results queue to drain 12180 1727204074.53047: waiting for pending results... 
12180 1727204074.53357: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 12180 1727204074.53487: in run() - task 0affcd87-79f5-ccb1-55ae-00000000003b 12180 1727204074.53505: variable 'ansible_search_path' from source: unknown 12180 1727204074.53508: variable 'ansible_search_path' from source: unknown 12180 1727204074.53547: calling self._execute() 12180 1727204074.53638: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204074.53641: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204074.53653: variable 'omit' from source: magic vars 12180 1727204074.54044: variable 'ansible_distribution_major_version' from source: facts 12180 1727204074.54057: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204074.54063: variable 'omit' from source: magic vars 12180 1727204074.54125: variable 'omit' from source: magic vars 12180 1727204074.54161: variable 'omit' from source: magic vars 12180 1727204074.54207: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12180 1727204074.54243: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12180 1727204074.54269: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12180 1727204074.54286: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204074.54297: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204074.54327: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12180 1727204074.54333: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204074.54336: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node1' 12180 1727204074.54438: Set connection var ansible_pipelining to False 12180 1727204074.54442: Set connection var ansible_shell_type to sh 12180 1727204074.54448: Set connection var ansible_timeout to 10 12180 1727204074.54451: Set connection var ansible_connection to ssh 12180 1727204074.54458: Set connection var ansible_shell_executable to /bin/sh 12180 1727204074.54463: Set connection var ansible_module_compression to ZIP_DEFLATED 12180 1727204074.54498: variable 'ansible_shell_executable' from source: unknown 12180 1727204074.54501: variable 'ansible_connection' from source: unknown 12180 1727204074.54504: variable 'ansible_module_compression' from source: unknown 12180 1727204074.54507: variable 'ansible_shell_type' from source: unknown 12180 1727204074.54509: variable 'ansible_shell_executable' from source: unknown 12180 1727204074.54511: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204074.54515: variable 'ansible_pipelining' from source: unknown 12180 1727204074.54517: variable 'ansible_timeout' from source: unknown 12180 1727204074.54522: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204074.54725: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 12180 1727204074.54735: variable 'omit' from source: magic vars 12180 1727204074.54740: starting attempt loop 12180 1727204074.54743: running the handler 12180 1727204074.54757: _low_level_execute_command(): starting 12180 1727204074.54767: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12180 1727204074.55535: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204074.55548: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 
1727204074.55569: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204074.55582: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204074.55623: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204074.55633: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204074.55641: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204074.55654: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204074.55666: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204074.55679: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204074.55685: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204074.55696: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204074.55709: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204074.55715: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204074.55722: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204074.55734: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204074.55808: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204074.55832: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204074.55843: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204074.55933: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 
1727204074.57589: stdout chunk (state=3): >>>/root <<< 12180 1727204074.57721: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204074.57763: stderr chunk (state=3): >>><<< 12180 1727204074.57768: stdout chunk (state=3): >>><<< 12180 1727204074.57784: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204074.57796: _low_level_execute_command(): starting 12180 1727204074.57802: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204074.57785-14714-44419263150453 `" && echo ansible-tmp-1727204074.57785-14714-44419263150453="` echo /root/.ansible/tmp/ansible-tmp-1727204074.57785-14714-44419263150453 `" ) && sleep 0' 12180 1727204074.58459: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 
Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204074.58467: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204074.58552: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204074.58581: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204074.58610: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204074.58705: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204074.60567: stdout chunk (state=3): >>>ansible-tmp-1727204074.57785-14714-44419263150453=/root/.ansible/tmp/ansible-tmp-1727204074.57785-14714-44419263150453 <<< 12180 1727204074.60685: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204074.60745: stderr chunk (state=3): >>><<< 12180 1727204074.60749: stdout chunk (state=3): >>><<< 12180 1727204074.60768: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204074.57785-14714-44419263150453=/root/.ansible/tmp/ansible-tmp-1727204074.57785-14714-44419263150453 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204074.60811: variable 'ansible_module_compression' from source: unknown 12180 1727204074.60848: ANSIBALLZ: Using lock for ping 12180 1727204074.60851: ANSIBALLZ: Acquiring lock 12180 1727204074.60854: ANSIBALLZ: Lock acquired: 140650302764528 12180 1727204074.60856: ANSIBALLZ: Creating module 12180 1727204074.72259: ANSIBALLZ: Writing module into payload 12180 1727204074.72306: ANSIBALLZ: Writing module 12180 1727204074.72327: ANSIBALLZ: Renaming module 12180 1727204074.72334: ANSIBALLZ: Done creating module 12180 1727204074.72348: variable 'ansible_facts' from source: unknown 12180 1727204074.72392: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204074.57785-14714-44419263150453/AnsiballZ_ping.py 12180 1727204074.72503: Sending initial data 12180 1727204074.72507: Sent initial data (150 bytes) 12180 1727204074.73212: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204074.73224: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204074.73252: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204074.73267: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204074.73322: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204074.73333: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204074.73401: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204074.75183: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12180 
1727204074.75232: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12180 1727204074.75283: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12180cbnqllfr/tmpbvhdpedi /root/.ansible/tmp/ansible-tmp-1727204074.57785-14714-44419263150453/AnsiballZ_ping.py <<< 12180 1727204074.75334: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12180 1727204074.76162: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204074.76279: stderr chunk (state=3): >>><<< 12180 1727204074.76286: stdout chunk (state=3): >>><<< 12180 1727204074.76304: done transferring module to remote 12180 1727204074.76313: _low_level_execute_command(): starting 12180 1727204074.76320: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204074.57785-14714-44419263150453/ /root/.ansible/tmp/ansible-tmp-1727204074.57785-14714-44419263150453/AnsiballZ_ping.py && sleep 0' 12180 1727204074.76788: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204074.76794: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204074.76827: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 12180 1727204074.76839: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration <<< 12180 1727204074.76850: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204074.76907: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204074.76911: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204074.76927: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204074.76989: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204074.78704: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204074.78755: stderr chunk (state=3): >>><<< 12180 1727204074.78759: stdout chunk (state=3): >>><<< 12180 1727204074.78775: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 
debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204074.78778: _low_level_execute_command(): starting 12180 1727204074.78784: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204074.57785-14714-44419263150453/AnsiballZ_ping.py && sleep 0' 12180 1727204074.79245: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204074.79249: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204074.79285: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204074.79297: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204074.79352: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204074.79365: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204074.79431: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204074.92478: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 12180 1727204074.93585: stderr chunk 
(state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. <<< 12180 1727204074.93589: stdout chunk (state=3): >>><<< 12180 1727204074.93595: stderr chunk (state=3): >>><<< 12180 1727204074.93615: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
12180 1727204074.93641: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204074.57785-14714-44419263150453/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12180 1727204074.93649: _low_level_execute_command(): starting 12180 1727204074.93654: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204074.57785-14714-44419263150453/ > /dev/null 2>&1 && sleep 0' 12180 1727204074.94526: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204074.94535: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204074.94577: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204074.94582: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204074.94597: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204074.94603: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204074.94676: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204074.94689: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204074.94695: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204074.94779: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204074.96642: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204074.96646: stderr chunk (state=3): >>><<< 12180 1727204074.96651: stdout chunk (state=3): >>><<< 12180 1727204074.96673: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 
2 debug2: Received exit status from master 0 12180 1727204074.96679: handler run complete 12180 1727204074.96698: attempt loop complete, returning result 12180 1727204074.96701: _execute() done 12180 1727204074.96703: dumping result to json 12180 1727204074.96706: done dumping result, returning 12180 1727204074.96717: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcd87-79f5-ccb1-55ae-00000000003b] 12180 1727204074.96722: sending task result for task 0affcd87-79f5-ccb1-55ae-00000000003b 12180 1727204074.96818: done sending task result for task 0affcd87-79f5-ccb1-55ae-00000000003b 12180 1727204074.96821: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "ping": "pong" } 12180 1727204074.96932: no more pending results, returning what we have 12180 1727204074.96936: results queue empty 12180 1727204074.96937: checking for any_errors_fatal 12180 1727204074.96943: done checking for any_errors_fatal 12180 1727204074.96944: checking for max_fail_percentage 12180 1727204074.96946: done checking for max_fail_percentage 12180 1727204074.96947: checking to see if all hosts have failed and the running result is not ok 12180 1727204074.96948: done checking to see if all hosts have failed 12180 1727204074.96949: getting the remaining hosts for this loop 12180 1727204074.96950: done getting the remaining hosts for this loop 12180 1727204074.96955: getting the next task for host managed-node1 12180 1727204074.96967: done getting next task for host managed-node1 12180 1727204074.96970: ^ task is: TASK: meta (role_complete) 12180 1727204074.96973: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12180 1727204074.96986: getting variables 12180 1727204074.96988: in VariableManager get_vars() 12180 1727204074.97035: Calling all_inventory to load vars for managed-node1 12180 1727204074.97038: Calling groups_inventory to load vars for managed-node1 12180 1727204074.97041: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204074.97053: Calling all_plugins_play to load vars for managed-node1 12180 1727204074.97056: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204074.97059: Calling groups_plugins_play to load vars for managed-node1 12180 1727204074.98758: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204075.00506: done with get_vars() 12180 1727204075.00544: done getting variables 12180 1727204075.00645: done queuing things up, now waiting for results queue to drain 12180 1727204075.00647: results queue empty 12180 1727204075.00648: checking for any_errors_fatal 12180 1727204075.00651: done checking for any_errors_fatal 12180 1727204075.00652: checking for max_fail_percentage 12180 1727204075.00653: done checking for max_fail_percentage 12180 1727204075.00654: checking to see if all hosts have failed and the running result is not ok 12180 1727204075.00654: done checking to see if all hosts have failed 12180 1727204075.00655: getting the remaining hosts for this loop 12180 1727204075.00656: done getting the remaining hosts for this loop 12180 1727204075.00658: getting the next task for host managed-node1 12180 1727204075.00668: done getting next task for host 
managed-node1 12180 1727204075.00672: ^ task is: TASK: Include the task 'get_interface_stat.yml' 12180 1727204075.00674: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12180 1727204075.00677: getting variables 12180 1727204075.00678: in VariableManager get_vars() 12180 1727204075.00693: Calling all_inventory to load vars for managed-node1 12180 1727204075.00695: Calling groups_inventory to load vars for managed-node1 12180 1727204075.00697: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204075.00702: Calling all_plugins_play to load vars for managed-node1 12180 1727204075.00704: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204075.00707: Calling groups_plugins_play to load vars for managed-node1 12180 1727204075.02089: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204075.03824: done with get_vars() 12180 1727204075.03861: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Tuesday 24 September 2024 14:54:35 -0400 (0:00:00.512) 0:00:22.451 ***** 12180 1727204075.03943: entering _queue_task() for managed-node1/include_tasks 12180 1727204075.04338: worker is 1 (out of 1 available) 12180 1727204075.04354: exiting _queue_task() for 
managed-node1/include_tasks 12180 1727204075.04369: done queuing things up, now waiting for results queue to drain 12180 1727204075.04370: waiting for pending results... 12180 1727204075.04688: running TaskExecutor() for managed-node1/TASK: Include the task 'get_interface_stat.yml' 12180 1727204075.04827: in run() - task 0affcd87-79f5-ccb1-55ae-00000000006e 12180 1727204075.04843: variable 'ansible_search_path' from source: unknown 12180 1727204075.04847: variable 'ansible_search_path' from source: unknown 12180 1727204075.04887: calling self._execute() 12180 1727204075.04988: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204075.04991: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204075.05002: variable 'omit' from source: magic vars 12180 1727204075.05429: variable 'ansible_distribution_major_version' from source: facts 12180 1727204075.05448: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204075.05455: _execute() done 12180 1727204075.05458: dumping result to json 12180 1727204075.05469: done dumping result, returning 12180 1727204075.05475: done running TaskExecutor() for managed-node1/TASK: Include the task 'get_interface_stat.yml' [0affcd87-79f5-ccb1-55ae-00000000006e] 12180 1727204075.05482: sending task result for task 0affcd87-79f5-ccb1-55ae-00000000006e 12180 1727204075.05578: done sending task result for task 0affcd87-79f5-ccb1-55ae-00000000006e 12180 1727204075.05581: WORKER PROCESS EXITING 12180 1727204075.05612: no more pending results, returning what we have 12180 1727204075.05618: in VariableManager get_vars() 12180 1727204075.05680: Calling all_inventory to load vars for managed-node1 12180 1727204075.05684: Calling groups_inventory to load vars for managed-node1 12180 1727204075.05687: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204075.05702: Calling all_plugins_play to load vars for managed-node1 12180 
1727204075.05706: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204075.05710: Calling groups_plugins_play to load vars for managed-node1 12180 1727204075.07472: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204075.09220: done with get_vars() 12180 1727204075.09257: variable 'ansible_search_path' from source: unknown 12180 1727204075.09259: variable 'ansible_search_path' from source: unknown 12180 1727204075.09303: we have included files to process 12180 1727204075.09304: generating all_blocks data 12180 1727204075.09306: done generating all_blocks data 12180 1727204075.09311: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 12180 1727204075.09312: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 12180 1727204075.09315: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 12180 1727204075.09538: done processing included file 12180 1727204075.09541: iterating over new_blocks loaded from include file 12180 1727204075.09543: in VariableManager get_vars() 12180 1727204075.09570: done with get_vars() 12180 1727204075.09572: filtering new block on tags 12180 1727204075.09591: done filtering new block on tags 12180 1727204075.09593: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node1 12180 1727204075.09598: extending task lists for all hosts with included blocks 12180 1727204075.09713: done extending task lists 12180 1727204075.09715: done processing included files 12180 1727204075.09716: results queue empty 12180 1727204075.09716: checking for 
any_errors_fatal 12180 1727204075.09718: done checking for any_errors_fatal 12180 1727204075.09718: checking for max_fail_percentage 12180 1727204075.09720: done checking for max_fail_percentage 12180 1727204075.09720: checking to see if all hosts have failed and the running result is not ok 12180 1727204075.09721: done checking to see if all hosts have failed 12180 1727204075.09722: getting the remaining hosts for this loop 12180 1727204075.09723: done getting the remaining hosts for this loop 12180 1727204075.09726: getting the next task for host managed-node1 12180 1727204075.09732: done getting next task for host managed-node1 12180 1727204075.09734: ^ task is: TASK: Get stat for interface {{ interface }} 12180 1727204075.09737: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12180 1727204075.09740: getting variables 12180 1727204075.09741: in VariableManager get_vars() 12180 1727204075.09755: Calling all_inventory to load vars for managed-node1 12180 1727204075.09757: Calling groups_inventory to load vars for managed-node1 12180 1727204075.09759: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204075.09767: Calling all_plugins_play to load vars for managed-node1 12180 1727204075.09770: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204075.09773: Calling groups_plugins_play to load vars for managed-node1 12180 1727204075.11116: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204075.12865: done with get_vars() 12180 1727204075.12895: done getting variables 12180 1727204075.13089: variable 'interface' from source: task vars 12180 1727204075.13093: variable 'controller_device' from source: play vars 12180 1727204075.13166: variable 'controller_device' from source: play vars TASK [Get stat for interface deprecated-bond] ********************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 14:54:35 -0400 (0:00:00.092) 0:00:22.543 ***** 12180 1727204075.13199: entering _queue_task() for managed-node1/stat 12180 1727204075.13545: worker is 1 (out of 1 available) 12180 1727204075.13556: exiting _queue_task() for managed-node1/stat 12180 1727204075.13572: done queuing things up, now waiting for results queue to drain 12180 1727204075.13573: waiting for pending results... 
12180 1727204075.13877: running TaskExecutor() for managed-node1/TASK: Get stat for interface deprecated-bond 12180 1727204075.14009: in run() - task 0affcd87-79f5-ccb1-55ae-000000000242 12180 1727204075.14030: variable 'ansible_search_path' from source: unknown 12180 1727204075.14034: variable 'ansible_search_path' from source: unknown 12180 1727204075.14072: calling self._execute() 12180 1727204075.14182: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204075.14186: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204075.14189: variable 'omit' from source: magic vars 12180 1727204075.14483: variable 'ansible_distribution_major_version' from source: facts 12180 1727204075.14493: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204075.14498: variable 'omit' from source: magic vars 12180 1727204075.14538: variable 'omit' from source: magic vars 12180 1727204075.14609: variable 'interface' from source: task vars 12180 1727204075.14613: variable 'controller_device' from source: play vars 12180 1727204075.14660: variable 'controller_device' from source: play vars 12180 1727204075.14677: variable 'omit' from source: magic vars 12180 1727204075.14712: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12180 1727204075.14741: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12180 1727204075.14758: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12180 1727204075.14772: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204075.14782: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204075.14809: variable 'inventory_hostname' from source: host vars 
for 'managed-node1' 12180 1727204075.14812: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204075.14815: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204075.14888: Set connection var ansible_pipelining to False 12180 1727204075.14892: Set connection var ansible_shell_type to sh 12180 1727204075.14894: Set connection var ansible_timeout to 10 12180 1727204075.14898: Set connection var ansible_connection to ssh 12180 1727204075.14903: Set connection var ansible_shell_executable to /bin/sh 12180 1727204075.14911: Set connection var ansible_module_compression to ZIP_DEFLATED 12180 1727204075.14929: variable 'ansible_shell_executable' from source: unknown 12180 1727204075.14934: variable 'ansible_connection' from source: unknown 12180 1727204075.14937: variable 'ansible_module_compression' from source: unknown 12180 1727204075.14939: variable 'ansible_shell_type' from source: unknown 12180 1727204075.14942: variable 'ansible_shell_executable' from source: unknown 12180 1727204075.14946: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204075.14950: variable 'ansible_pipelining' from source: unknown 12180 1727204075.14952: variable 'ansible_timeout' from source: unknown 12180 1727204075.14956: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204075.15106: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 12180 1727204075.15112: variable 'omit' from source: magic vars 12180 1727204075.15120: starting attempt loop 12180 1727204075.15123: running the handler 12180 1727204075.15137: _low_level_execute_command(): starting 12180 1727204075.15144: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12180 
1727204075.15669: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204075.15684: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204075.15706: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204075.15719: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204075.15778: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204075.15791: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204075.15855: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204075.17472: stdout chunk (state=3): >>>/root <<< 12180 1727204075.17579: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204075.17631: stderr chunk (state=3): >>><<< 12180 1727204075.17636: stdout chunk (state=3): >>><<< 12180 1727204075.17657: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204075.17669: _low_level_execute_command(): starting 12180 1727204075.17675: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204075.1765654-14740-124843210074900 `" && echo ansible-tmp-1727204075.1765654-14740-124843210074900="` echo /root/.ansible/tmp/ansible-tmp-1727204075.1765654-14740-124843210074900 `" ) && sleep 0' 12180 1727204075.18120: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204075.18124: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204075.18159: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 
1727204075.18176: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204075.18185: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204075.18191: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204075.18249: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204075.18252: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204075.18262: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204075.18332: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204075.20201: stdout chunk (state=3): >>>ansible-tmp-1727204075.1765654-14740-124843210074900=/root/.ansible/tmp/ansible-tmp-1727204075.1765654-14740-124843210074900 <<< 12180 1727204075.20384: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204075.20388: stderr chunk (state=3): >>><<< 12180 1727204075.20390: stdout chunk (state=3): >>><<< 12180 1727204075.20416: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204075.1765654-14740-124843210074900=/root/.ansible/tmp/ansible-tmp-1727204075.1765654-14740-124843210074900 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204075.20465: variable 'ansible_module_compression' from source: unknown 12180 1727204075.20527: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12180cbnqllfr/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 12180 1727204075.20560: variable 'ansible_facts' from source: unknown 12180 1727204075.20650: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204075.1765654-14740-124843210074900/AnsiballZ_stat.py 12180 1727204075.20801: Sending initial data 12180 1727204075.20805: Sent initial data (153 bytes) 12180 1727204075.21841: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204075.21858: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204075.21870: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204075.21884: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204075.21923: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204075.21933: stderr chunk (state=3): >>>debug2: match not found <<< 
12180 1727204075.21940: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204075.21956: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204075.21973: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204075.21980: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204075.21988: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204075.21997: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204075.22009: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204075.22017: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204075.22024: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204075.22034: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204075.22112: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204075.22127: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204075.22142: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204075.22226: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204075.23989: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension 
"lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12180 1727204075.24041: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12180 1727204075.24100: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12180cbnqllfr/tmpq54pw53e /root/.ansible/tmp/ansible-tmp-1727204075.1765654-14740-124843210074900/AnsiballZ_stat.py <<< 12180 1727204075.24160: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12180 1727204075.25276: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204075.25457: stderr chunk (state=3): >>><<< 12180 1727204075.25460: stdout chunk (state=3): >>><<< 12180 1727204075.25486: done transferring module to remote 12180 1727204075.25497: _low_level_execute_command(): starting 12180 1727204075.25502: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204075.1765654-14740-124843210074900/ /root/.ansible/tmp/ansible-tmp-1727204075.1765654-14740-124843210074900/AnsiballZ_stat.py && sleep 0' 12180 1727204075.26179: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204075.26190: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204075.26202: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204075.26216: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204075.26261: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204075.26272: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204075.26283: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204075.26297: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204075.26305: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204075.26311: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204075.26319: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204075.26331: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204075.26345: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204075.26353: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204075.26360: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204075.26371: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204075.26445: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204075.26462: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204075.26474: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204075.26560: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204075.28288: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204075.28384: stderr chunk (state=3): >>><<< 12180 1727204075.28388: stdout chunk (state=3): >>><<< 12180 1727204075.28405: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204075.28411: _low_level_execute_command(): starting 12180 1727204075.28413: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204075.1765654-14740-124843210074900/AnsiballZ_stat.py && sleep 0' 12180 1727204075.29057: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204075.29071: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204075.29078: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204075.29093: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204075.29135: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204075.29141: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204075.29151: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204075.29169: stderr chunk (state=3): 
>>>debug1: configuration requests final Match pass <<< 12180 1727204075.29178: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204075.29184: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204075.29192: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204075.29200: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204075.29214: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204075.29219: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204075.29226: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204075.29235: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204075.29307: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204075.29333: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204075.29336: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204075.29435: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204075.42481: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/deprecated-bond", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 26257, "dev": 21, "nlink": 1, "atime": 1727204073.9990454, "mtime": 1727204073.9990454, "ctime": 1727204073.9990454, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": 
true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/deprecated-bond", "lnk_target": "../../devices/virtual/net/deprecated-bond", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/deprecated-bond", "follow": false, "checksum_algorithm": "sha1"}}} <<< 12180 1727204075.43431: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. <<< 12180 1727204075.43490: stderr chunk (state=3): >>><<< 12180 1727204075.43494: stdout chunk (state=3): >>><<< 12180 1727204075.43510: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/deprecated-bond", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 26257, "dev": 21, "nlink": 1, "atime": 1727204073.9990454, "mtime": 1727204073.9990454, "ctime": 1727204073.9990454, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/deprecated-bond", "lnk_target": "../../devices/virtual/net/deprecated-bond", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/deprecated-bond", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 
10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 12180 1727204075.43556: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/deprecated-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204075.1765654-14740-124843210074900/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12180 1727204075.43565: _low_level_execute_command(): starting 12180 1727204075.43572: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204075.1765654-14740-124843210074900/ > /dev/null 2>&1 && sleep 0' 12180 1727204075.44041: 
stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204075.44045: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204075.44080: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 12180 1727204075.44084: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 12180 1727204075.44098: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204075.44101: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204075.44109: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204075.44122: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204075.44126: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204075.44184: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204075.44192: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204075.44214: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204075.44271: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204075.46139: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204075.46168: stderr chunk (state=3): >>><<< 12180 1727204075.46172: stdout chunk (state=3): >>><<< 12180 
1727204075.46618: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204075.46622: handler run complete 12180 1727204075.46625: attempt loop complete, returning result 12180 1727204075.46627: _execute() done 12180 1727204075.46632: dumping result to json 12180 1727204075.46634: done dumping result, returning 12180 1727204075.46636: done running TaskExecutor() for managed-node1/TASK: Get stat for interface deprecated-bond [0affcd87-79f5-ccb1-55ae-000000000242] 12180 1727204075.46638: sending task result for task 0affcd87-79f5-ccb1-55ae-000000000242 12180 1727204075.46718: done sending task result for task 0affcd87-79f5-ccb1-55ae-000000000242 12180 1727204075.46721: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "stat": { "atime": 1727204073.9990454, "block_size": 4096, "blocks": 0, "ctime": 1727204073.9990454, "dev": 21, 
"device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 26257, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/deprecated-bond", "lnk_target": "../../devices/virtual/net/deprecated-bond", "mode": "0777", "mtime": 1727204073.9990454, "nlink": 1, "path": "/sys/class/net/deprecated-bond", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 12180 1727204075.46817: no more pending results, returning what we have 12180 1727204075.46821: results queue empty 12180 1727204075.46822: checking for any_errors_fatal 12180 1727204075.46824: done checking for any_errors_fatal 12180 1727204075.46825: checking for max_fail_percentage 12180 1727204075.46827: done checking for max_fail_percentage 12180 1727204075.46829: checking to see if all hosts have failed and the running result is not ok 12180 1727204075.46830: done checking to see if all hosts have failed 12180 1727204075.46831: getting the remaining hosts for this loop 12180 1727204075.46832: done getting the remaining hosts for this loop 12180 1727204075.46836: getting the next task for host managed-node1 12180 1727204075.46845: done getting next task for host managed-node1 12180 1727204075.46847: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 12180 1727204075.46850: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12180 1727204075.46854: getting variables
12180 1727204075.46856: in VariableManager get_vars()
12180 1727204075.46904: Calling all_inventory to load vars for managed-node1
12180 1727204075.46907: Calling groups_inventory to load vars for managed-node1
12180 1727204075.46910: Calling all_plugins_inventory to load vars for managed-node1
12180 1727204075.46921: Calling all_plugins_play to load vars for managed-node1
12180 1727204075.46924: Calling groups_plugins_inventory to load vars for managed-node1
12180 1727204075.46933: Calling groups_plugins_play to load vars for managed-node1
12180 1727204075.48545: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12180 1727204075.49585: done with get_vars()
12180 1727204075.49604: done getting variables
12180 1727204075.49652: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
12180 1727204075.49746: variable 'interface' from source: task vars
12180 1727204075.49749: variable 'controller_device' from source: play vars
12180 1727204075.49795: variable 'controller_device' from source: play vars

TASK [Assert that the interface is present - 'deprecated-bond'] ****************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5
Tuesday 24 September 2024  14:54:35 -0400 (0:00:00.366)       0:00:22.910 *****
12180 1727204075.49821: entering _queue_task() for managed-node1/assert
12180 1727204075.50062: worker is 1 (out of 1 available)
12180 1727204075.50075: exiting
_queue_task() for managed-node1/assert 12180 1727204075.50090: done queuing things up, now waiting for results queue to drain 12180 1727204075.50091: waiting for pending results... 12180 1727204075.50406: running TaskExecutor() for managed-node1/TASK: Assert that the interface is present - 'deprecated-bond' 12180 1727204075.50482: in run() - task 0affcd87-79f5-ccb1-55ae-00000000006f 12180 1727204075.50494: variable 'ansible_search_path' from source: unknown 12180 1727204075.50498: variable 'ansible_search_path' from source: unknown 12180 1727204075.50535: calling self._execute() 12180 1727204075.50625: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204075.50633: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204075.50642: variable 'omit' from source: magic vars 12180 1727204075.50987: variable 'ansible_distribution_major_version' from source: facts 12180 1727204075.50999: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204075.51006: variable 'omit' from source: magic vars 12180 1727204075.51051: variable 'omit' from source: magic vars 12180 1727204075.51147: variable 'interface' from source: task vars 12180 1727204075.51151: variable 'controller_device' from source: play vars 12180 1727204075.51215: variable 'controller_device' from source: play vars 12180 1727204075.51234: variable 'omit' from source: magic vars 12180 1727204075.51279: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12180 1727204075.51311: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12180 1727204075.51335: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12180 1727204075.51350: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204075.51372: Loading 
ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204075.51397: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12180 1727204075.51400: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204075.51402: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204075.51502: Set connection var ansible_pipelining to False 12180 1727204075.51505: Set connection var ansible_shell_type to sh 12180 1727204075.51519: Set connection var ansible_timeout to 10 12180 1727204075.51522: Set connection var ansible_connection to ssh 12180 1727204075.51526: Set connection var ansible_shell_executable to /bin/sh 12180 1727204075.51531: Set connection var ansible_module_compression to ZIP_DEFLATED 12180 1727204075.51556: variable 'ansible_shell_executable' from source: unknown 12180 1727204075.51559: variable 'ansible_connection' from source: unknown 12180 1727204075.51562: variable 'ansible_module_compression' from source: unknown 12180 1727204075.51566: variable 'ansible_shell_type' from source: unknown 12180 1727204075.51569: variable 'ansible_shell_executable' from source: unknown 12180 1727204075.51571: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204075.51573: variable 'ansible_pipelining' from source: unknown 12180 1727204075.51575: variable 'ansible_timeout' from source: unknown 12180 1727204075.51578: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204075.51716: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12180 1727204075.51722: variable 'omit' from source: magic vars 12180 
1727204075.51728: starting attempt loop
12180 1727204075.51735: running the handler
12180 1727204075.51878: variable 'interface_stat' from source: set_fact
12180 1727204075.51890: Evaluated conditional (interface_stat.stat.exists): True
12180 1727204075.51896: handler run complete
12180 1727204075.51906: attempt loop complete, returning result
12180 1727204075.51909: _execute() done
12180 1727204075.51912: dumping result to json
12180 1727204075.51914: done dumping result, returning
12180 1727204075.51921: done running TaskExecutor() for managed-node1/TASK: Assert that the interface is present - 'deprecated-bond' [0affcd87-79f5-ccb1-55ae-00000000006f]
12180 1727204075.51925: sending task result for task 0affcd87-79f5-ccb1-55ae-00000000006f
12180 1727204075.52024: done sending task result for task 0affcd87-79f5-ccb1-55ae-00000000006f
12180 1727204075.52026: WORKER PROCESS EXITING
ok: [managed-node1] => {
    "changed": false
}

MSG:

All assertions passed
12180 1727204075.52077: no more pending results, returning what we have
12180 1727204075.52081: results queue empty
12180 1727204075.52082: checking for any_errors_fatal
12180 1727204075.52091: done checking for any_errors_fatal
12180 1727204075.52092: checking for max_fail_percentage
12180 1727204075.52094: done checking for max_fail_percentage
12180 1727204075.52095: checking to see if all hosts have failed and the running result is not ok
12180 1727204075.52096: done checking to see if all hosts have failed
12180 1727204075.52096: getting the remaining hosts for this loop
12180 1727204075.52097: done getting the remaining hosts for this loop
12180 1727204075.52101: getting the next task for host managed-node1
12180 1727204075.52109: done getting next task for host managed-node1
12180 1727204075.52112: ^ task is: TASK: Include the task 'assert_profile_present.yml'
12180 1727204075.52114: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1,
update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12180 1727204075.52117: getting variables
12180 1727204075.52119: in VariableManager get_vars()
12180 1727204075.52162: Calling all_inventory to load vars for managed-node1
12180 1727204075.52171: Calling groups_inventory to load vars for managed-node1
12180 1727204075.52174: Calling all_plugins_inventory to load vars for managed-node1
12180 1727204075.52185: Calling all_plugins_play to load vars for managed-node1
12180 1727204075.52187: Calling groups_plugins_inventory to load vars for managed-node1
12180 1727204075.52190: Calling groups_plugins_play to load vars for managed-node1
12180 1727204075.53018: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12180 1727204075.54128: done with get_vars()
12180 1727204075.54161: done getting variables

TASK [Include the task 'assert_profile_present.yml'] ***************************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml:67
Tuesday 24 September 2024  14:54:35 -0400 (0:00:00.044)       0:00:22.954 *****
12180 1727204075.54267: entering _queue_task() for managed-node1/include_tasks
12180 1727204075.54608: worker is 1 (out of 1 available)
12180 1727204075.54621: exiting _queue_task() for managed-node1/include_tasks
12180 1727204075.54634: done queuing things up, now waiting for results queue to drain
12180 1727204075.54635: waiting for pending results...
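For orientation, the two tasks that just completed in this trace (the `stat` call whose JSON result appears above, and the assertion it feeds) can be reconstructed from the logged module arguments and conditional. This is a sketch inferred from the log, not the verbatim contents of `assert_device_present.yml`; the log reports `interface_stat` as coming from `set_fact`, so the `register` shown here is an assumed equivalent.

```yaml
# Hypothetical reconstruction of tasks/assert_device_present.yml,
# based on the module_args and "Evaluated conditional" lines in the log.
- name: Get stat for interface {{ interface }}
  stat:
    path: "/sys/class/net/{{ interface }}"   # logged path: /sys/class/net/deprecated-bond
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: interface_stat

- name: Assert that the interface is present - '{{ interface }}'
  assert:
    that:
      - interface_stat.stat.exists   # logged: Evaluated conditional ... True
```

Note that `/sys/class/net/<device>` is a symlink for virtual devices, which is why the stat result above reports `islnk: true` with `lnk_source` under `/sys/devices/virtual/net/`.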
12180 1727204075.54936: running TaskExecutor() for managed-node1/TASK: Include the task 'assert_profile_present.yml' 12180 1727204075.55048: in run() - task 0affcd87-79f5-ccb1-55ae-000000000070 12180 1727204075.55072: variable 'ansible_search_path' from source: unknown 12180 1727204075.55132: variable 'controller_profile' from source: play vars 12180 1727204075.55332: variable 'controller_profile' from source: play vars 12180 1727204075.55353: variable 'port1_profile' from source: play vars 12180 1727204075.55433: variable 'port1_profile' from source: play vars 12180 1727204075.55447: variable 'port2_profile' from source: play vars 12180 1727204075.55518: variable 'port2_profile' from source: play vars 12180 1727204075.55541: variable 'omit' from source: magic vars 12180 1727204075.55692: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204075.55706: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204075.55721: variable 'omit' from source: magic vars 12180 1727204075.55952: variable 'ansible_distribution_major_version' from source: facts 12180 1727204075.55976: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204075.55992: variable 'item' from source: unknown 12180 1727204075.56041: variable 'item' from source: unknown 12180 1727204075.56159: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204075.56164: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204075.56167: variable 'omit' from source: magic vars 12180 1727204075.56254: variable 'ansible_distribution_major_version' from source: facts 12180 1727204075.56258: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204075.56284: variable 'item' from source: unknown 12180 1727204075.56325: variable 'item' from source: unknown 12180 1727204075.56399: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 
1727204075.56402: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204075.56405: variable 'omit' from source: magic vars 12180 1727204075.56508: variable 'ansible_distribution_major_version' from source: facts 12180 1727204075.56511: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204075.56535: variable 'item' from source: unknown 12180 1727204075.56578: variable 'item' from source: unknown 12180 1727204075.56646: dumping result to json 12180 1727204075.56649: done dumping result, returning 12180 1727204075.56652: done running TaskExecutor() for managed-node1/TASK: Include the task 'assert_profile_present.yml' [0affcd87-79f5-ccb1-55ae-000000000070] 12180 1727204075.56654: sending task result for task 0affcd87-79f5-ccb1-55ae-000000000070 12180 1727204075.56690: done sending task result for task 0affcd87-79f5-ccb1-55ae-000000000070 12180 1727204075.56693: WORKER PROCESS EXITING 12180 1727204075.56778: no more pending results, returning what we have 12180 1727204075.56782: in VariableManager get_vars() 12180 1727204075.56824: Calling all_inventory to load vars for managed-node1 12180 1727204075.56826: Calling groups_inventory to load vars for managed-node1 12180 1727204075.56828: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204075.56839: Calling all_plugins_play to load vars for managed-node1 12180 1727204075.56843: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204075.56845: Calling groups_plugins_play to load vars for managed-node1 12180 1727204075.57972: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204075.59558: done with get_vars() 12180 1727204075.59584: variable 'ansible_search_path' from source: unknown 12180 1727204075.59602: variable 'ansible_search_path' from source: unknown 12180 1727204075.59611: variable 'ansible_search_path' from source: unknown 12180 
1727204075.59618: we have included files to process 12180 1727204075.59619: generating all_blocks data 12180 1727204075.59620: done generating all_blocks data 12180 1727204075.59624: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 12180 1727204075.59625: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 12180 1727204075.59627: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 12180 1727204075.59839: in VariableManager get_vars() 12180 1727204075.59867: done with get_vars() 12180 1727204075.60139: done processing included file 12180 1727204075.60140: iterating over new_blocks loaded from include file 12180 1727204075.60141: in VariableManager get_vars() 12180 1727204075.60155: done with get_vars() 12180 1727204075.60156: filtering new block on tags 12180 1727204075.60171: done filtering new block on tags 12180 1727204075.60173: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed-node1 => (item=bond0) 12180 1727204075.60176: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 12180 1727204075.60177: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 12180 1727204075.60179: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 12180 1727204075.60251: in VariableManager get_vars() 12180 1727204075.60268: done with get_vars() 12180 1727204075.60418: done 
processing included file 12180 1727204075.60419: iterating over new_blocks loaded from include file 12180 1727204075.60420: in VariableManager get_vars() 12180 1727204075.60434: done with get_vars() 12180 1727204075.60435: filtering new block on tags 12180 1727204075.60446: done filtering new block on tags 12180 1727204075.60447: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed-node1 => (item=bond0.0) 12180 1727204075.60450: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 12180 1727204075.60451: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 12180 1727204075.60453: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 12180 1727204075.60516: in VariableManager get_vars() 12180 1727204075.60579: done with get_vars() 12180 1727204075.60726: done processing included file 12180 1727204075.60727: iterating over new_blocks loaded from include file 12180 1727204075.60730: in VariableManager get_vars() 12180 1727204075.60741: done with get_vars() 12180 1727204075.60743: filtering new block on tags 12180 1727204075.60754: done filtering new block on tags 12180 1727204075.60755: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed-node1 => (item=bond0.1) 12180 1727204075.60758: extending task lists for all hosts with included blocks 12180 1727204075.62828: done extending task lists 12180 1727204075.62838: done processing included files 12180 1727204075.62839: results queue empty 12180 
1727204075.62840: checking for any_errors_fatal 12180 1727204075.62844: done checking for any_errors_fatal 12180 1727204075.62845: checking for max_fail_percentage 12180 1727204075.62846: done checking for max_fail_percentage 12180 1727204075.62847: checking to see if all hosts have failed and the running result is not ok 12180 1727204075.62848: done checking to see if all hosts have failed 12180 1727204075.62848: getting the remaining hosts for this loop 12180 1727204075.62850: done getting the remaining hosts for this loop 12180 1727204075.62852: getting the next task for host managed-node1 12180 1727204075.62857: done getting next task for host managed-node1 12180 1727204075.62859: ^ task is: TASK: Include the task 'get_profile_stat.yml' 12180 1727204075.62862: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
12180 1727204075.62866: getting variables
12180 1727204075.62868: in VariableManager get_vars()
12180 1727204075.62885: Calling all_inventory to load vars for managed-node1
12180 1727204075.62887: Calling groups_inventory to load vars for managed-node1
12180 1727204075.62889: Calling all_plugins_inventory to load vars for managed-node1
12180 1727204075.62896: Calling all_plugins_play to load vars for managed-node1
12180 1727204075.62899: Calling groups_plugins_inventory to load vars for managed-node1
12180 1727204075.62902: Calling groups_plugins_play to load vars for managed-node1
12180 1727204075.63707: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12180 1727204075.64711: done with get_vars()
12180 1727204075.64732: done getting variables

TASK [Include the task 'get_profile_stat.yml'] *********************************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3
Tuesday 24 September 2024  14:54:35 -0400 (0:00:00.105)       0:00:23.059 *****
12180 1727204075.64796: entering _queue_task() for managed-node1/include_tasks
12180 1727204075.65046: worker is 1 (out of 1 available)
12180 1727204075.65060: exiting _queue_task() for managed-node1/include_tasks
12180 1727204075.65074: done queuing things up, now waiting for results queue to drain
12180 1727204075.65076: waiting for pending results...
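The include expansion traced above (three copies of `assert_profile_present.yml`, one per item `bond0`, `bond0.0`, `bond0.1`, each gated on `ansible_distribution_major_version != '6'`) corresponds to a looped `include_tasks` at `tests_bond_deprecated.yml:67`. The sketch below is inferred from the log; the variable names match those Ansible reports (`controller_profile`, `port1_profile`, `port2_profile`), but the exact YAML in the test playbook may differ.

```yaml
# Hypothetical sketch of the include at tests_bond_deprecated.yml:67.
# With a loop, the `when` conditional is evaluated once per item,
# matching the three "Evaluated conditional ... True" lines in the log.
- name: Include the task 'assert_profile_present.yml'
  include_tasks: tasks/assert_profile_present.yml
  loop:
    - "{{ controller_profile }}"   # expands to bond0
    - "{{ port1_profile }}"        # expands to bond0.0
    - "{{ port2_profile }}"        # expands to bond0.1
  when: ansible_distribution_major_version != '6'
```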
12180 1727204075.65246: running TaskExecutor() for managed-node1/TASK: Include the task 'get_profile_stat.yml' 12180 1727204075.65337: in run() - task 0affcd87-79f5-ccb1-55ae-000000000260 12180 1727204075.65349: variable 'ansible_search_path' from source: unknown 12180 1727204075.65353: variable 'ansible_search_path' from source: unknown 12180 1727204075.65384: calling self._execute() 12180 1727204075.65456: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204075.65460: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204075.65469: variable 'omit' from source: magic vars 12180 1727204075.65745: variable 'ansible_distribution_major_version' from source: facts 12180 1727204075.65754: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204075.65760: _execute() done 12180 1727204075.65765: dumping result to json 12180 1727204075.65768: done dumping result, returning 12180 1727204075.65774: done running TaskExecutor() for managed-node1/TASK: Include the task 'get_profile_stat.yml' [0affcd87-79f5-ccb1-55ae-000000000260] 12180 1727204075.65780: sending task result for task 0affcd87-79f5-ccb1-55ae-000000000260 12180 1727204075.65866: done sending task result for task 0affcd87-79f5-ccb1-55ae-000000000260 12180 1727204075.65869: WORKER PROCESS EXITING 12180 1727204075.65899: no more pending results, returning what we have 12180 1727204075.65904: in VariableManager get_vars() 12180 1727204075.65952: Calling all_inventory to load vars for managed-node1 12180 1727204075.65955: Calling groups_inventory to load vars for managed-node1 12180 1727204075.65957: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204075.65973: Calling all_plugins_play to load vars for managed-node1 12180 1727204075.65975: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204075.65985: Calling groups_plugins_play to load vars for managed-node1 12180 
1727204075.70167: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204075.71066: done with get_vars() 12180 1727204075.71083: variable 'ansible_search_path' from source: unknown 12180 1727204075.71084: variable 'ansible_search_path' from source: unknown 12180 1727204075.71110: we have included files to process 12180 1727204075.71111: generating all_blocks data 12180 1727204075.71112: done generating all_blocks data 12180 1727204075.71112: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 12180 1727204075.71113: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 12180 1727204075.71115: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 12180 1727204075.71780: done processing included file 12180 1727204075.71782: iterating over new_blocks loaded from include file 12180 1727204075.71783: in VariableManager get_vars() 12180 1727204075.71797: done with get_vars() 12180 1727204075.71798: filtering new block on tags 12180 1727204075.71813: done filtering new block on tags 12180 1727204075.71814: in VariableManager get_vars() 12180 1727204075.71826: done with get_vars() 12180 1727204075.71827: filtering new block on tags 12180 1727204075.71844: done filtering new block on tags 12180 1727204075.71845: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed-node1 12180 1727204075.71848: extending task lists for all hosts with included blocks 12180 1727204075.71993: done extending task lists 12180 1727204075.71995: done processing included files 12180 1727204075.71995: results queue empty 12180 
1727204075.71996: checking for any_errors_fatal 12180 1727204075.71997: done checking for any_errors_fatal 12180 1727204075.71998: checking for max_fail_percentage 12180 1727204075.71999: done checking for max_fail_percentage 12180 1727204075.71999: checking to see if all hosts have failed and the running result is not ok 12180 1727204075.72000: done checking to see if all hosts have failed 12180 1727204075.72000: getting the remaining hosts for this loop 12180 1727204075.72001: done getting the remaining hosts for this loop 12180 1727204075.72002: getting the next task for host managed-node1 12180 1727204075.72005: done getting next task for host managed-node1 12180 1727204075.72006: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 12180 1727204075.72009: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12180 1727204075.72010: getting variables 12180 1727204075.72011: in VariableManager get_vars() 12180 1727204075.72020: Calling all_inventory to load vars for managed-node1 12180 1727204075.72022: Calling groups_inventory to load vars for managed-node1 12180 1727204075.72023: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204075.72027: Calling all_plugins_play to load vars for managed-node1 12180 1727204075.72030: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204075.72032: Calling groups_plugins_play to load vars for managed-node1 12180 1727204075.72714: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204075.73617: done with get_vars() 12180 1727204075.73634: done getting variables 12180 1727204075.73666: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Tuesday 24 September 2024 14:54:35 -0400 (0:00:00.088) 0:00:23.148 ***** 12180 1727204075.73686: entering _queue_task() for managed-node1/set_fact 12180 1727204075.73925: worker is 1 (out of 1 available) 12180 1727204075.73940: exiting _queue_task() for managed-node1/set_fact 12180 1727204075.73954: done queuing things up, now waiting for results queue to drain 12180 1727204075.73956: waiting for pending results... 
12180 1727204075.74144: running TaskExecutor() for managed-node1/TASK: Initialize NM profile exist and ansible_managed comment flag 12180 1727204075.74224: in run() - task 0affcd87-79f5-ccb1-55ae-0000000003b3 12180 1727204075.74238: variable 'ansible_search_path' from source: unknown 12180 1727204075.74242: variable 'ansible_search_path' from source: unknown 12180 1727204075.74271: calling self._execute() 12180 1727204075.74344: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204075.74348: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204075.74357: variable 'omit' from source: magic vars 12180 1727204075.74647: variable 'ansible_distribution_major_version' from source: facts 12180 1727204075.74658: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204075.74665: variable 'omit' from source: magic vars 12180 1727204075.74698: variable 'omit' from source: magic vars 12180 1727204075.74723: variable 'omit' from source: magic vars 12180 1727204075.74763: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12180 1727204075.74791: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12180 1727204075.74808: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12180 1727204075.74822: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204075.74833: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204075.74860: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12180 1727204075.74865: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204075.74868: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node1' 12180 1727204075.74939: Set connection var ansible_pipelining to False 12180 1727204075.74944: Set connection var ansible_shell_type to sh 12180 1727204075.74947: Set connection var ansible_timeout to 10 12180 1727204075.74954: Set connection var ansible_connection to ssh 12180 1727204075.74956: Set connection var ansible_shell_executable to /bin/sh 12180 1727204075.74964: Set connection var ansible_module_compression to ZIP_DEFLATED 12180 1727204075.74986: variable 'ansible_shell_executable' from source: unknown 12180 1727204075.74990: variable 'ansible_connection' from source: unknown 12180 1727204075.74993: variable 'ansible_module_compression' from source: unknown 12180 1727204075.74995: variable 'ansible_shell_type' from source: unknown 12180 1727204075.74998: variable 'ansible_shell_executable' from source: unknown 12180 1727204075.75000: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204075.75002: variable 'ansible_pipelining' from source: unknown 12180 1727204075.75004: variable 'ansible_timeout' from source: unknown 12180 1727204075.75007: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204075.75112: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12180 1727204075.75121: variable 'omit' from source: magic vars 12180 1727204075.75126: starting attempt loop 12180 1727204075.75129: running the handler 12180 1727204075.75142: handler run complete 12180 1727204075.75150: attempt loop complete, returning result 12180 1727204075.75154: _execute() done 12180 1727204075.75156: dumping result to json 12180 1727204075.75159: done dumping result, returning 12180 1727204075.75166: done running TaskExecutor() for 
managed-node1/TASK: Initialize NM profile exist and ansible_managed comment flag [0affcd87-79f5-ccb1-55ae-0000000003b3] 12180 1727204075.75173: sending task result for task 0affcd87-79f5-ccb1-55ae-0000000003b3 12180 1727204075.75255: done sending task result for task 0affcd87-79f5-ccb1-55ae-0000000003b3 12180 1727204075.75258: WORKER PROCESS EXITING ok: [managed-node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 12180 1727204075.75329: no more pending results, returning what we have 12180 1727204075.75332: results queue empty 12180 1727204075.75333: checking for any_errors_fatal 12180 1727204075.75334: done checking for any_errors_fatal 12180 1727204075.75335: checking for max_fail_percentage 12180 1727204075.75337: done checking for max_fail_percentage 12180 1727204075.75338: checking to see if all hosts have failed and the running result is not ok 12180 1727204075.75339: done checking to see if all hosts have failed 12180 1727204075.75339: getting the remaining hosts for this loop 12180 1727204075.75340: done getting the remaining hosts for this loop 12180 1727204075.75344: getting the next task for host managed-node1 12180 1727204075.75351: done getting next task for host managed-node1 12180 1727204075.75354: ^ task is: TASK: Stat profile file 12180 1727204075.75358: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12180 1727204075.75362: getting variables 12180 1727204075.75368: in VariableManager get_vars() 12180 1727204075.75411: Calling all_inventory to load vars for managed-node1 12180 1727204075.75414: Calling groups_inventory to load vars for managed-node1 12180 1727204075.75415: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204075.75425: Calling all_plugins_play to load vars for managed-node1 12180 1727204075.75427: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204075.75430: Calling groups_plugins_play to load vars for managed-node1 12180 1727204075.76248: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204075.77182: done with get_vars() 12180 1727204075.77200: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Tuesday 24 September 2024 14:54:35 -0400 (0:00:00.035) 0:00:23.184 ***** 12180 1727204075.77275: entering _queue_task() for managed-node1/stat 12180 1727204075.77508: worker is 1 (out of 1 available) 12180 1727204075.77521: exiting _queue_task() for managed-node1/stat 12180 1727204075.77533: done queuing things up, now waiting for results queue to drain 12180 1727204075.77535: waiting for pending results... 
12180 1727204075.77714: running TaskExecutor() for managed-node1/TASK: Stat profile file 12180 1727204075.77795: in run() - task 0affcd87-79f5-ccb1-55ae-0000000003b4 12180 1727204075.77806: variable 'ansible_search_path' from source: unknown 12180 1727204075.77810: variable 'ansible_search_path' from source: unknown 12180 1727204075.77842: calling self._execute() 12180 1727204075.77915: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204075.77918: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204075.77927: variable 'omit' from source: magic vars 12180 1727204075.78214: variable 'ansible_distribution_major_version' from source: facts 12180 1727204075.78225: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204075.78232: variable 'omit' from source: magic vars 12180 1727204075.78266: variable 'omit' from source: magic vars 12180 1727204075.78338: variable 'profile' from source: include params 12180 1727204075.78341: variable 'item' from source: include params 12180 1727204075.78387: variable 'item' from source: include params 12180 1727204075.78402: variable 'omit' from source: magic vars 12180 1727204075.78443: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12180 1727204075.78472: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12180 1727204075.78489: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12180 1727204075.78501: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204075.78510: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204075.78542: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12180 
1727204075.78545: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204075.78548: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204075.78618: Set connection var ansible_pipelining to False 12180 1727204075.78622: Set connection var ansible_shell_type to sh 12180 1727204075.78625: Set connection var ansible_timeout to 10 12180 1727204075.78631: Set connection var ansible_connection to ssh 12180 1727204075.78640: Set connection var ansible_shell_executable to /bin/sh 12180 1727204075.78643: Set connection var ansible_module_compression to ZIP_DEFLATED 12180 1727204075.78666: variable 'ansible_shell_executable' from source: unknown 12180 1727204075.78669: variable 'ansible_connection' from source: unknown 12180 1727204075.78672: variable 'ansible_module_compression' from source: unknown 12180 1727204075.78675: variable 'ansible_shell_type' from source: unknown 12180 1727204075.78677: variable 'ansible_shell_executable' from source: unknown 12180 1727204075.78679: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204075.78682: variable 'ansible_pipelining' from source: unknown 12180 1727204075.78684: variable 'ansible_timeout' from source: unknown 12180 1727204075.78687: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204075.78839: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 12180 1727204075.78848: variable 'omit' from source: magic vars 12180 1727204075.78852: starting attempt loop 12180 1727204075.78856: running the handler 12180 1727204075.78872: _low_level_execute_command(): starting 12180 1727204075.78879: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12180 1727204075.79412: stderr chunk 
(state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204075.79424: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204075.79449: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204075.79462: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204075.79523: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204075.79530: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204075.79607: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204075.81280: stdout chunk (state=3): >>>/root <<< 12180 1727204075.81382: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204075.81448: stderr chunk (state=3): >>><<< 12180 1727204075.81451: stdout chunk (state=3): >>><<< 12180 1727204075.81474: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204075.81487: _low_level_execute_command(): starting 12180 1727204075.81493: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204075.8147473-14772-110647723977207 `" && echo ansible-tmp-1727204075.8147473-14772-110647723977207="` echo /root/.ansible/tmp/ansible-tmp-1727204075.8147473-14772-110647723977207 `" ) && sleep 0' 12180 1727204075.81977: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204075.81988: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204075.82024: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204075.82039: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204075.82050: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204075.82099: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204075.82111: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204075.82121: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204075.82185: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204075.84050: stdout chunk (state=3): >>>ansible-tmp-1727204075.8147473-14772-110647723977207=/root/.ansible/tmp/ansible-tmp-1727204075.8147473-14772-110647723977207 <<< 12180 1727204075.84162: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204075.84221: stderr chunk (state=3): >>><<< 12180 1727204075.84226: stdout chunk (state=3): >>><<< 12180 1727204075.84248: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204075.8147473-14772-110647723977207=/root/.ansible/tmp/ansible-tmp-1727204075.8147473-14772-110647723977207 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204075.84291: variable 'ansible_module_compression' from source: unknown 12180 1727204075.84343: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12180cbnqllfr/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 12180 1727204075.84373: variable 'ansible_facts' from source: unknown 12180 1727204075.84437: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204075.8147473-14772-110647723977207/AnsiballZ_stat.py 12180 1727204075.84555: Sending initial data 12180 1727204075.84558: Sent initial data (153 bytes) 12180 1727204075.85265: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204075.85272: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204075.85305: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204075.85317: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204075.85380: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204075.85386: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204075.85457: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204075.87189: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12180 1727204075.87235: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12180 1727204075.87290: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12180cbnqllfr/tmphjkr6tk0 /root/.ansible/tmp/ansible-tmp-1727204075.8147473-14772-110647723977207/AnsiballZ_stat.py <<< 12180 1727204075.87339: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12180 1727204075.88176: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204075.88293: 
stderr chunk (state=3): >>><<< 12180 1727204075.88296: stdout chunk (state=3): >>><<< 12180 1727204075.88313: done transferring module to remote 12180 1727204075.88322: _low_level_execute_command(): starting 12180 1727204075.88334: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204075.8147473-14772-110647723977207/ /root/.ansible/tmp/ansible-tmp-1727204075.8147473-14772-110647723977207/AnsiballZ_stat.py && sleep 0' 12180 1727204075.88809: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204075.88822: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204075.88842: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 12180 1727204075.88853: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found <<< 12180 1727204075.88863: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204075.88911: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204075.88923: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204075.88988: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 
12180 1727204075.90708: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
12180 1727204075.90774: stderr chunk (state=3): >>><<<
12180 1727204075.90778: stdout chunk (state=3): >>><<<
12180 1727204075.90792: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
12180 1727204075.90795: _low_level_execute_command(): starting
12180 1727204075.90804: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204075.8147473-14772-110647723977207/AnsiballZ_stat.py && sleep 0'
12180 1727204075.91282: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
12180 1727204075.91286: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
12180 1727204075.91321: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
12180 1727204075.91334: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
12180 1727204075.91396: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<<
12180 1727204075.91403: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
12180 1727204075.91481: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
12180 1727204076.04457: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0", "follow": false, "checksum_algorithm": "sha1"}}} <<<
12180 1727204076.05427: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. <<<
12180 1727204076.05493: stderr chunk (state=3): >>><<<
12180 1727204076.05497: stdout chunk (state=3): >>><<<
12180 1727204076.05511: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed.
12180 1727204076.05540: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-bond0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204075.8147473-14772-110647723977207/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None})
12180 1727204076.05546: _low_level_execute_command(): starting
12180 1727204076.05555: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204075.8147473-14772-110647723977207/ > /dev/null 2>&1 && sleep 0'
12180 1727204076.06032: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<<
12180 1727204076.06036: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
12180 1727204076.06072: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
12180 1727204076.06084: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
12180 1727204076.06141: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
12180 1727204076.06153: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
12180 1727204076.06215: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
12180 1727204076.08004: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
12180 1727204076.08067: stderr chunk (state=3): >>><<<
12180 1727204076.08071: stdout chunk (state=3): >>><<<
12180 1727204076.08085: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
12180 1727204076.08093: handler run complete
12180 1727204076.08109: attempt loop complete, returning result
12180 1727204076.08112: _execute() done
12180 1727204076.08114: dumping result to json
12180 1727204076.08119: done dumping result, returning
12180 1727204076.08127: done running TaskExecutor() for managed-node1/TASK: Stat profile file [0affcd87-79f5-ccb1-55ae-0000000003b4]
12180 1727204076.08137: sending task result for task 0affcd87-79f5-ccb1-55ae-0000000003b4
12180 1727204076.08234: done sending task result for task 0affcd87-79f5-ccb1-55ae-0000000003b4
12180 1727204076.08237: WORKER PROCESS EXITING
ok: [managed-node1] => {
    "changed": false,
    "stat": {
        "exists": false
    }
}
12180 1727204076.08293: no more pending results, returning what we have
12180 1727204076.08297: results queue empty
12180 1727204076.08298: checking for any_errors_fatal
12180 1727204076.08302: done checking for any_errors_fatal
12180 1727204076.08302: checking for max_fail_percentage
12180 1727204076.08304: done checking for max_fail_percentage
12180 1727204076.08305: checking to see if all hosts have failed and the running result is not ok
12180 1727204076.08306: done checking to see if all hosts have failed
12180 1727204076.08307: getting the remaining hosts for this loop
12180 1727204076.08308: done getting the remaining hosts for this loop
12180 1727204076.08312: getting the next task for host managed-node1
12180 1727204076.08319: done getting next task for host managed-node1
12180 1727204076.08321: ^ task is: TASK: Set NM profile exist flag based on the profile files
12180 1727204076.08326: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12180 1727204076.08332: getting variables
12180 1727204076.08334: in VariableManager get_vars()
12180 1727204076.08378: Calling all_inventory to load vars for managed-node1
12180 1727204076.08381: Calling groups_inventory to load vars for managed-node1
12180 1727204076.08384: Calling all_plugins_inventory to load vars for managed-node1
12180 1727204076.08395: Calling all_plugins_play to load vars for managed-node1
12180 1727204076.08397: Calling groups_plugins_inventory to load vars for managed-node1
12180 1727204076.08399: Calling groups_plugins_play to load vars for managed-node1
12180 1727204076.09354: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12180 1727204076.10282: done with get_vars()
12180 1727204076.10302: done getting variables
12180 1727204076.10352: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Set NM profile exist flag based on the profile files] ********************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17
Tuesday 24 September 2024 14:54:36 -0400 (0:00:00.330) 0:00:23.515 *****
12180 1727204076.10376: entering _queue_task() for managed-node1/set_fact
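The skip that follows hinges on the stat result registered above: the module returned `{"stat": {"exists": false}}`, and this set_fact task's `when:` condition evaluates `profile_stat.stat.exists`. A minimal illustrative sketch of that flow (a hypothetical helper, not Ansible's actual implementation):

```python
# Hypothetical illustration of how a registered stat result drives the
# conditional skip recorded in this log; not Ansible's real code path.
profile_stat = {"changed": False, "stat": {"exists": False}}

def condition_holds(registered: dict) -> bool:
    # Mirrors the playbook conditional `when: profile_stat.stat.exists`
    return bool(registered["stat"]["exists"])

if not condition_holds(profile_stat):
    # Shape of the skip result Ansible reports for this task
    task_result = {
        "changed": False,
        "false_condition": "profile_stat.stat.exists",
        "skip_reason": "Conditional result was False",
    }
```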
12180 1727204076.10618: worker is 1 (out of 1 available)
12180 1727204076.10634: exiting _queue_task() for managed-node1/set_fact
12180 1727204076.10647: done queuing things up, now waiting for results queue to drain
12180 1727204076.10649: waiting for pending results...
12180 1727204076.10831: running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag based on the profile files
12180 1727204076.10914: in run() - task 0affcd87-79f5-ccb1-55ae-0000000003b5
12180 1727204076.10926: variable 'ansible_search_path' from source: unknown
12180 1727204076.10931: variable 'ansible_search_path' from source: unknown
12180 1727204076.10958: calling self._execute()
12180 1727204076.11036: variable 'ansible_host' from source: host vars for 'managed-node1'
12180 1727204076.11041: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12180 1727204076.11050: variable 'omit' from source: magic vars
12180 1727204076.11336: variable 'ansible_distribution_major_version' from source: facts
12180 1727204076.11345: Evaluated conditional (ansible_distribution_major_version != '6'): True
12180 1727204076.11434: variable 'profile_stat' from source: set_fact
12180 1727204076.11444: Evaluated conditional (profile_stat.stat.exists): False
12180 1727204076.11447: when evaluation is False, skipping this task
12180 1727204076.11449: _execute() done
12180 1727204076.11452: dumping result to json
12180 1727204076.11454: done dumping result, returning
12180 1727204076.11461: done running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag based on the profile files [0affcd87-79f5-ccb1-55ae-0000000003b5]
12180 1727204076.11468: sending task result for task 0affcd87-79f5-ccb1-55ae-0000000003b5
12180 1727204076.11560: done sending task result for task 0affcd87-79f5-ccb1-55ae-0000000003b5
12180 1727204076.11563: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
12180 1727204076.11614: no more pending results, returning what we have
12180 1727204076.11618: results queue empty
12180 1727204076.11619: checking for any_errors_fatal
12180 1727204076.11626: done checking for any_errors_fatal
12180 1727204076.11627: checking for max_fail_percentage
12180 1727204076.11631: done checking for max_fail_percentage
12180 1727204076.11632: checking to see if all hosts have failed and the running result is not ok
12180 1727204076.11632: done checking to see if all hosts have failed
12180 1727204076.11633: getting the remaining hosts for this loop
12180 1727204076.11634: done getting the remaining hosts for this loop
12180 1727204076.11638: getting the next task for host managed-node1
12180 1727204076.11645: done getting next task for host managed-node1
12180 1727204076.11648: ^ task is: TASK: Get NM profile info
12180 1727204076.11652: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12180 1727204076.11658: getting variables
12180 1727204076.11660: in VariableManager get_vars()
12180 1727204076.11706: Calling all_inventory to load vars for managed-node1
12180 1727204076.11709: Calling groups_inventory to load vars for managed-node1
12180 1727204076.11711: Calling all_plugins_inventory to load vars for managed-node1
12180 1727204076.11721: Calling all_plugins_play to load vars for managed-node1
12180 1727204076.11723: Calling groups_plugins_inventory to load vars for managed-node1
12180 1727204076.11725: Calling groups_plugins_play to load vars for managed-node1
12180 1727204076.12568: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12180 1727204076.13614: done with get_vars()
12180 1727204076.13634: done getting variables
12180 1727204076.13681: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Get NM profile info] *****************************************************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25
Tuesday 24 September 2024 14:54:36 -0400 (0:00:00.033) 0:00:23.549 *****
12180 1727204076.13705: entering _queue_task() for managed-node1/shell
12180 1727204076.13948: worker is 1 (out of 1 available)
12180 1727204076.13961: exiting _queue_task() for managed-node1/shell
12180 1727204076.13975: done queuing things up, now waiting for results queue to drain
12180 1727204076.13977: waiting for pending results...
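The shell task queued here runs the pipeline `nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc`, whose two-column output is captured in the task result further down in this log. A hypothetical parser sketch (not part of the role) showing the shape of that data:

```python
# Hypothetical parser for the two-column NAME/FILENAME output of
# `nmcli -f NAME,FILENAME connection show`; illustrative only.
def parse_nm_profiles(stdout: str) -> dict:
    profiles = {}
    for line in stdout.splitlines():
        parts = line.split()  # NAME column, then FILENAME column
        if len(parts) == 2:
            name, filename = parts
            profiles[name] = filename
    return profiles

# stdout exactly as captured in the task result recorded in this log
sample = (
    "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection \n"
    "bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection \n"
    "bond0 /etc/NetworkManager/system-connections/bond0.nmconnection "
)
```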
12180 1727204076.14155: running TaskExecutor() for managed-node1/TASK: Get NM profile info
12180 1727204076.14236: in run() - task 0affcd87-79f5-ccb1-55ae-0000000003b6
12180 1727204076.14247: variable 'ansible_search_path' from source: unknown
12180 1727204076.14251: variable 'ansible_search_path' from source: unknown
12180 1727204076.14284: calling self._execute()
12180 1727204076.14359: variable 'ansible_host' from source: host vars for 'managed-node1'
12180 1727204076.14363: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12180 1727204076.14374: variable 'omit' from source: magic vars
12180 1727204076.14649: variable 'ansible_distribution_major_version' from source: facts
12180 1727204076.14659: Evaluated conditional (ansible_distribution_major_version != '6'): True
12180 1727204076.14667: variable 'omit' from source: magic vars
12180 1727204076.14696: variable 'omit' from source: magic vars
12180 1727204076.14773: variable 'profile' from source: include params
12180 1727204076.14777: variable 'item' from source: include params
12180 1727204076.14827: variable 'item' from source: include params
12180 1727204076.14839: variable 'omit' from source: magic vars
12180 1727204076.14875: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
12180 1727204076.14902: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
12180 1727204076.14919: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
12180 1727204076.14938: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
12180 1727204076.14942: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
12180 1727204076.14971: variable 'inventory_hostname' from source: host vars for 'managed-node1'
12180 1727204076.14975: variable 'ansible_host' from source: host vars for 'managed-node1'
12180 1727204076.14977: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12180 1727204076.15046: Set connection var ansible_pipelining to False
12180 1727204076.15050: Set connection var ansible_shell_type to sh
12180 1727204076.15052: Set connection var ansible_timeout to 10
12180 1727204076.15054: Set connection var ansible_connection to ssh
12180 1727204076.15060: Set connection var ansible_shell_executable to /bin/sh
12180 1727204076.15066: Set connection var ansible_module_compression to ZIP_DEFLATED
12180 1727204076.15087: variable 'ansible_shell_executable' from source: unknown
12180 1727204076.15090: variable 'ansible_connection' from source: unknown
12180 1727204076.15092: variable 'ansible_module_compression' from source: unknown
12180 1727204076.15094: variable 'ansible_shell_type' from source: unknown
12180 1727204076.15097: variable 'ansible_shell_executable' from source: unknown
12180 1727204076.15099: variable 'ansible_host' from source: host vars for 'managed-node1'
12180 1727204076.15103: variable 'ansible_pipelining' from source: unknown
12180 1727204076.15106: variable 'ansible_timeout' from source: unknown
12180 1727204076.15110: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12180 1727204076.15212: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
12180 1727204076.15221: variable 'omit' from source: magic vars
12180 1727204076.15226: starting attempt loop
12180 1727204076.15230: running the handler
12180 1727204076.15238: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
12180 1727204076.15252: _low_level_execute_command(): starting
12180 1727204076.15263: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
12180 1727204076.15800: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
12180 1727204076.15817: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
12180 1727204076.15879: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
12180 1727204076.15898: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
12180 1727204076.15910: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
12180 1727204076.15987: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
12180 1727204076.17566: stdout chunk (state=3): >>>/root <<<
12180 1727204076.17668: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
12180 1727204076.17726: stderr chunk (state=3): >>><<<
12180 1727204076.17734: stdout chunk (state=3): >>><<<
12180 1727204076.17755: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
12180 1727204076.17768: _low_level_execute_command(): starting
12180 1727204076.17774: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204076.1775398-14781-59410233173215 `" && echo ansible-tmp-1727204076.1775398-14781-59410233173215="` echo /root/.ansible/tmp/ansible-tmp-1727204076.1775398-14781-59410233173215 `" ) && sleep 0'
12180 1727204076.18251: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
12180 1727204076.18265: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
12180 1727204076.18295: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<<
12180 1727204076.18309: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration <<<
12180 1727204076.18320: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
12180 1727204076.18370: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
12180 1727204076.18382: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
12180 1727204076.18447: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
12180 1727204076.20305: stdout chunk (state=3): >>>ansible-tmp-1727204076.1775398-14781-59410233173215=/root/.ansible/tmp/ansible-tmp-1727204076.1775398-14781-59410233173215 <<<
12180 1727204076.20413: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
12180 1727204076.20479: stderr chunk (state=3): >>><<<
12180 1727204076.20484: stdout chunk (state=3): >>><<<
12180 1727204076.20500: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204076.1775398-14781-59410233173215=/root/.ansible/tmp/ansible-tmp-1727204076.1775398-14781-59410233173215 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
12180 1727204076.20528: variable 'ansible_module_compression' from source: unknown
12180 1727204076.20578: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12180cbnqllfr/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED
12180 1727204076.20608: variable 'ansible_facts' from source: unknown
12180 1727204076.20675: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204076.1775398-14781-59410233173215/AnsiballZ_command.py
12180 1727204076.20792: Sending initial data
12180 1727204076.20795: Sent initial data (155 bytes)
12180 1727204076.21504: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
12180 1727204076.21508: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
12180 1727204076.21550: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<<
12180 1727204076.21567: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
12180 1727204076.21619: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<<
12180 1727204076.21624: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
12180 1727204076.21701: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
12180 1727204076.23430: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<<
12180 1727204076.23481: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<<
12180 1727204076.23537: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12180cbnqllfr/tmpjfylba4f /root/.ansible/tmp/ansible-tmp-1727204076.1775398-14781-59410233173215/AnsiballZ_command.py <<<
12180 1727204076.23585: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<<
12180 1727204076.24425: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
12180 1727204076.24548: stderr chunk (state=3): >>><<<
12180 1727204076.24555: stdout chunk (state=3): >>><<<
12180 1727204076.24574: done transferring module to remote
12180 1727204076.24584: _low_level_execute_command(): starting
12180 1727204076.24589: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204076.1775398-14781-59410233173215/ /root/.ansible/tmp/ansible-tmp-1727204076.1775398-14781-59410233173215/AnsiballZ_command.py && sleep 0'
12180 1727204076.25053: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
12180 1727204076.25057: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
12180 1727204076.25095: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<<
12180 1727204076.25109: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
12180 1727204076.25119: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
12180 1727204076.25169: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
12180 1727204076.25200: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
12180 1727204076.25248: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
12180 1727204076.26977: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
12180 1727204076.27040: stderr chunk (state=3): >>><<<
12180 1727204076.27044: stdout chunk (state=3): >>><<<
12180 1727204076.27059: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
12180 1727204076.27062: _low_level_execute_command(): starting
12180 1727204076.27069: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204076.1775398-14781-59410233173215/AnsiballZ_command.py && sleep 0'
12180 1727204076.27546: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
12180 1727204076.27551: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
12180 1727204076.27589: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
12180 1727204076.27594: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found <<<
12180 1727204076.27596: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
12180 1727204076.27638: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
12180 1727204076.27653: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
12180 1727204076.27722: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
12180 1727204076.55803: stdout chunk (state=3): >>> {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection \nbond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection \nbond0 /etc/NetworkManager/system-connections/bond0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "start": "2024-09-24 14:54:36.407037", "end": "2024-09-24 14:54:36.557378", "delta": "0:00:00.150341", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<<
12180 1727204076.57081: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. <<<
12180 1727204076.57139: stderr chunk (state=3): >>><<<
12180 1727204076.57145: stdout chunk (state=3): >>><<<
12180 1727204076.57160: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection \nbond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection \nbond0 /etc/NetworkManager/system-connections/bond0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "start": "2024-09-24 14:54:36.407037", "end": "2024-09-24 14:54:36.557378", "delta": "0:00:00.150341", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 12180 1727204076.57192: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204076.1775398-14781-59410233173215/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12180 1727204076.57202: _low_level_execute_command(): starting 12180 1727204076.57205: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204076.1775398-14781-59410233173215/ > /dev/null 2>&1 && sleep 0' 12180 1727204076.57692: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12180 
1727204076.57696: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204076.57731: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 12180 1727204076.57736: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204076.57739: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found <<< 12180 1727204076.57741: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204076.57798: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204076.57801: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204076.57807: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204076.57867: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204076.59658: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204076.59724: stderr chunk (state=3): >>><<< 12180 1727204076.59727: stdout chunk (state=3): >>><<< 12180 1727204076.59744: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204076.59750: handler run complete 12180 1727204076.59769: Evaluated conditional (False): False 12180 1727204076.59780: attempt loop complete, returning result 12180 1727204076.59783: _execute() done 12180 1727204076.59785: dumping result to json 12180 1727204076.59791: done dumping result, returning 12180 1727204076.59802: done running TaskExecutor() for managed-node1/TASK: Get NM profile info [0affcd87-79f5-ccb1-55ae-0000000003b6] 12180 1727204076.59805: sending task result for task 0affcd87-79f5-ccb1-55ae-0000000003b6 12180 1727204076.59903: done sending task result for task 0affcd87-79f5-ccb1-55ae-0000000003b6 12180 1727204076.59907: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "delta": "0:00:00.150341", "end": "2024-09-24 14:54:36.557378", "rc": 0, "start": "2024-09-24 14:54:36.407037" } STDOUT: bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection bond0 
/etc/NetworkManager/system-connections/bond0.nmconnection 12180 1727204076.59979: no more pending results, returning what we have 12180 1727204076.59982: results queue empty 12180 1727204076.59983: checking for any_errors_fatal 12180 1727204076.59989: done checking for any_errors_fatal 12180 1727204076.59990: checking for max_fail_percentage 12180 1727204076.59991: done checking for max_fail_percentage 12180 1727204076.59992: checking to see if all hosts have failed and the running result is not ok 12180 1727204076.59993: done checking to see if all hosts have failed 12180 1727204076.59994: getting the remaining hosts for this loop 12180 1727204076.59995: done getting the remaining hosts for this loop 12180 1727204076.59999: getting the next task for host managed-node1 12180 1727204076.60005: done getting next task for host managed-node1 12180 1727204076.60008: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 12180 1727204076.60013: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12180 1727204076.60018: getting variables 12180 1727204076.60019: in VariableManager get_vars() 12180 1727204076.60059: Calling all_inventory to load vars for managed-node1 12180 1727204076.60061: Calling groups_inventory to load vars for managed-node1 12180 1727204076.60070: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204076.60081: Calling all_plugins_play to load vars for managed-node1 12180 1727204076.60084: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204076.60086: Calling groups_plugins_play to load vars for managed-node1 12180 1727204076.60905: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204076.61850: done with get_vars() 12180 1727204076.61872: done getting variables 12180 1727204076.61919: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Tuesday 24 September 2024 14:54:36 -0400 (0:00:00.482) 0:00:24.031 ***** 12180 1727204076.61945: entering _queue_task() for managed-node1/set_fact 12180 1727204076.62189: worker is 1 (out of 1 available) 12180 1727204076.62202: exiting _queue_task() for managed-node1/set_fact 12180 1727204076.62215: done queuing things up, now waiting for results queue to drain 12180 1727204076.62217: waiting for pending results... 
12180 1727204076.62397: running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 12180 1727204076.62480: in run() - task 0affcd87-79f5-ccb1-55ae-0000000003b7 12180 1727204076.62493: variable 'ansible_search_path' from source: unknown 12180 1727204076.62496: variable 'ansible_search_path' from source: unknown 12180 1727204076.62524: calling self._execute() 12180 1727204076.62607: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204076.62611: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204076.62619: variable 'omit' from source: magic vars 12180 1727204076.62897: variable 'ansible_distribution_major_version' from source: facts 12180 1727204076.62908: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204076.63003: variable 'nm_profile_exists' from source: set_fact 12180 1727204076.63012: Evaluated conditional (nm_profile_exists.rc == 0): True 12180 1727204076.63017: variable 'omit' from source: magic vars 12180 1727204076.63050: variable 'omit' from source: magic vars 12180 1727204076.63074: variable 'omit' from source: magic vars 12180 1727204076.63111: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12180 1727204076.63137: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12180 1727204076.63154: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12180 1727204076.63168: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204076.63178: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204076.63204: variable 'inventory_hostname' from source: host vars for 'managed-node1' 
12180 1727204076.63208: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204076.63212: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204076.63283: Set connection var ansible_pipelining to False 12180 1727204076.63286: Set connection var ansible_shell_type to sh 12180 1727204076.63291: Set connection var ansible_timeout to 10 12180 1727204076.63296: Set connection var ansible_connection to ssh 12180 1727204076.63301: Set connection var ansible_shell_executable to /bin/sh 12180 1727204076.63305: Set connection var ansible_module_compression to ZIP_DEFLATED 12180 1727204076.63332: variable 'ansible_shell_executable' from source: unknown 12180 1727204076.63335: variable 'ansible_connection' from source: unknown 12180 1727204076.63337: variable 'ansible_module_compression' from source: unknown 12180 1727204076.63339: variable 'ansible_shell_type' from source: unknown 12180 1727204076.63342: variable 'ansible_shell_executable' from source: unknown 12180 1727204076.63344: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204076.63346: variable 'ansible_pipelining' from source: unknown 12180 1727204076.63348: variable 'ansible_timeout' from source: unknown 12180 1727204076.63350: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204076.63452: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12180 1727204076.63461: variable 'omit' from source: magic vars 12180 1727204076.63467: starting attempt loop 12180 1727204076.63472: running the handler 12180 1727204076.63482: handler run complete 12180 1727204076.63490: attempt loop complete, returning result 12180 1727204076.63493: _execute() done 
12180 1727204076.63495: dumping result to json 12180 1727204076.63498: done dumping result, returning 12180 1727204076.63505: done running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0affcd87-79f5-ccb1-55ae-0000000003b7] 12180 1727204076.63509: sending task result for task 0affcd87-79f5-ccb1-55ae-0000000003b7 12180 1727204076.63600: done sending task result for task 0affcd87-79f5-ccb1-55ae-0000000003b7 12180 1727204076.63603: WORKER PROCESS EXITING ok: [managed-node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 12180 1727204076.63655: no more pending results, returning what we have 12180 1727204076.63658: results queue empty 12180 1727204076.63659: checking for any_errors_fatal 12180 1727204076.63670: done checking for any_errors_fatal 12180 1727204076.63671: checking for max_fail_percentage 12180 1727204076.63673: done checking for max_fail_percentage 12180 1727204076.63674: checking to see if all hosts have failed and the running result is not ok 12180 1727204076.63680: done checking to see if all hosts have failed 12180 1727204076.63681: getting the remaining hosts for this loop 12180 1727204076.63682: done getting the remaining hosts for this loop 12180 1727204076.63687: getting the next task for host managed-node1 12180 1727204076.63696: done getting next task for host managed-node1 12180 1727204076.63699: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 12180 1727204076.63703: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12180 1727204076.63706: getting variables 12180 1727204076.63708: in VariableManager get_vars() 12180 1727204076.63752: Calling all_inventory to load vars for managed-node1 12180 1727204076.63755: Calling groups_inventory to load vars for managed-node1 12180 1727204076.63757: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204076.63768: Calling all_plugins_play to load vars for managed-node1 12180 1727204076.63771: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204076.63773: Calling groups_plugins_play to load vars for managed-node1 12180 1727204076.64742: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204076.65668: done with get_vars() 12180 1727204076.65686: done getting variables 12180 1727204076.65733: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 12180 1727204076.65824: variable 'profile' from source: include params 12180 1727204076.65827: variable 'item' from source: include params 12180 1727204076.65875: variable 'item' from source: include params TASK [Get the ansible_managed comment in ifcfg-bond0] ************************** task path: 
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Tuesday 24 September 2024 14:54:36 -0400 (0:00:00.039) 0:00:24.070 ***** 12180 1727204076.65903: entering _queue_task() for managed-node1/command 12180 1727204076.66143: worker is 1 (out of 1 available) 12180 1727204076.66157: exiting _queue_task() for managed-node1/command 12180 1727204076.66169: done queuing things up, now waiting for results queue to drain 12180 1727204076.66171: waiting for pending results... 12180 1727204076.66351: running TaskExecutor() for managed-node1/TASK: Get the ansible_managed comment in ifcfg-bond0 12180 1727204076.66438: in run() - task 0affcd87-79f5-ccb1-55ae-0000000003b9 12180 1727204076.66449: variable 'ansible_search_path' from source: unknown 12180 1727204076.66453: variable 'ansible_search_path' from source: unknown 12180 1727204076.66482: calling self._execute() 12180 1727204076.66556: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204076.66559: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204076.66569: variable 'omit' from source: magic vars 12180 1727204076.66835: variable 'ansible_distribution_major_version' from source: facts 12180 1727204076.66847: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204076.66934: variable 'profile_stat' from source: set_fact 12180 1727204076.66947: Evaluated conditional (profile_stat.stat.exists): False 12180 1727204076.66952: when evaluation is False, skipping this task 12180 1727204076.66955: _execute() done 12180 1727204076.66957: dumping result to json 12180 1727204076.66959: done dumping result, returning 12180 1727204076.66962: done running TaskExecutor() for managed-node1/TASK: Get the ansible_managed comment in ifcfg-bond0 [0affcd87-79f5-ccb1-55ae-0000000003b9] 12180 1727204076.66971: sending task result for task 0affcd87-79f5-ccb1-55ae-0000000003b9 12180 
1727204076.67056: done sending task result for task 0affcd87-79f5-ccb1-55ae-0000000003b9 12180 1727204076.67058: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 12180 1727204076.67122: no more pending results, returning what we have 12180 1727204076.67126: results queue empty 12180 1727204076.67127: checking for any_errors_fatal 12180 1727204076.67133: done checking for any_errors_fatal 12180 1727204076.67134: checking for max_fail_percentage 12180 1727204076.67136: done checking for max_fail_percentage 12180 1727204076.67137: checking to see if all hosts have failed and the running result is not ok 12180 1727204076.67137: done checking to see if all hosts have failed 12180 1727204076.67138: getting the remaining hosts for this loop 12180 1727204076.67139: done getting the remaining hosts for this loop 12180 1727204076.67143: getting the next task for host managed-node1 12180 1727204076.67149: done getting next task for host managed-node1 12180 1727204076.67152: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 12180 1727204076.67156: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 12180 1727204076.67160: getting variables 12180 1727204076.67161: in VariableManager get_vars() 12180 1727204076.67204: Calling all_inventory to load vars for managed-node1 12180 1727204076.67208: Calling groups_inventory to load vars for managed-node1 12180 1727204076.67210: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204076.67220: Calling all_plugins_play to load vars for managed-node1 12180 1727204076.67222: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204076.67225: Calling groups_plugins_play to load vars for managed-node1 12180 1727204076.68026: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204076.68947: done with get_vars() 12180 1727204076.68969: done getting variables 12180 1727204076.69016: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 12180 1727204076.69103: variable 'profile' from source: include params 12180 1727204076.69106: variable 'item' from source: include params 12180 1727204076.69150: variable 'item' from source: include params TASK [Verify the ansible_managed comment in ifcfg-bond0] *********************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Tuesday 24 September 2024 14:54:36 -0400 (0:00:00.032) 0:00:24.103 ***** 12180 1727204076.69176: entering _queue_task() for managed-node1/set_fact 12180 1727204076.69414: worker is 1 (out of 1 available) 12180 1727204076.69428: exiting _queue_task() for managed-node1/set_fact 12180 1727204076.69440: done queuing things up, now waiting for results queue 
to drain 12180 1727204076.69442: waiting for pending results... 12180 1727204076.69623: running TaskExecutor() for managed-node1/TASK: Verify the ansible_managed comment in ifcfg-bond0 12180 1727204076.69713: in run() - task 0affcd87-79f5-ccb1-55ae-0000000003ba 12180 1727204076.69727: variable 'ansible_search_path' from source: unknown 12180 1727204076.69733: variable 'ansible_search_path' from source: unknown 12180 1727204076.69759: calling self._execute() 12180 1727204076.69835: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204076.69838: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204076.69846: variable 'omit' from source: magic vars 12180 1727204076.70114: variable 'ansible_distribution_major_version' from source: facts 12180 1727204076.70125: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204076.70213: variable 'profile_stat' from source: set_fact 12180 1727204076.70224: Evaluated conditional (profile_stat.stat.exists): False 12180 1727204076.70227: when evaluation is False, skipping this task 12180 1727204076.70233: _execute() done 12180 1727204076.70236: dumping result to json 12180 1727204076.70239: done dumping result, returning 12180 1727204076.70242: done running TaskExecutor() for managed-node1/TASK: Verify the ansible_managed comment in ifcfg-bond0 [0affcd87-79f5-ccb1-55ae-0000000003ba] 12180 1727204076.70247: sending task result for task 0affcd87-79f5-ccb1-55ae-0000000003ba 12180 1727204076.70337: done sending task result for task 0affcd87-79f5-ccb1-55ae-0000000003ba 12180 1727204076.70340: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 12180 1727204076.70389: no more pending results, returning what we have 12180 1727204076.70393: results queue empty 12180 1727204076.70394: checking for any_errors_fatal 12180 1727204076.70402: 
done checking for any_errors_fatal 12180 1727204076.70402: checking for max_fail_percentage 12180 1727204076.70404: done checking for max_fail_percentage 12180 1727204076.70405: checking to see if all hosts have failed and the running result is not ok 12180 1727204076.70406: done checking to see if all hosts have failed 12180 1727204076.70406: getting the remaining hosts for this loop 12180 1727204076.70408: done getting the remaining hosts for this loop 12180 1727204076.70412: getting the next task for host managed-node1 12180 1727204076.70419: done getting next task for host managed-node1 12180 1727204076.70421: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 12180 1727204076.70426: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12180 1727204076.70433: getting variables 12180 1727204076.70435: in VariableManager get_vars() 12180 1727204076.70482: Calling all_inventory to load vars for managed-node1 12180 1727204076.70485: Calling groups_inventory to load vars for managed-node1 12180 1727204076.70486: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204076.70497: Calling all_plugins_play to load vars for managed-node1 12180 1727204076.70499: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204076.70502: Calling groups_plugins_play to load vars for managed-node1 12180 1727204076.71452: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204076.72377: done with get_vars() 12180 1727204076.72396: done getting variables 12180 1727204076.72447: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 12180 1727204076.72537: variable 'profile' from source: include params 12180 1727204076.72540: variable 'item' from source: include params 12180 1727204076.72582: variable 'item' from source: include params TASK [Get the fingerprint comment in ifcfg-bond0] ****************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Tuesday 24 September 2024 14:54:36 -0400 (0:00:00.034) 0:00:24.138 ***** 12180 1727204076.72606: entering _queue_task() for managed-node1/command 12180 1727204076.72849: worker is 1 (out of 1 available) 12180 1727204076.72862: exiting _queue_task() for managed-node1/command 12180 1727204076.73019: done queuing things up, now waiting for results queue to drain 12180 1727204076.73022: waiting for pending results... 
12180 1727204076.73137: running TaskExecutor() for managed-node1/TASK: Get the fingerprint comment in ifcfg-bond0 12180 1727204076.73225: in run() - task 0affcd87-79f5-ccb1-55ae-0000000003bb 12180 1727204076.73236: variable 'ansible_search_path' from source: unknown 12180 1727204076.73240: variable 'ansible_search_path' from source: unknown 12180 1727204076.73271: calling self._execute() 12180 1727204076.73345: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204076.73349: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204076.73359: variable 'omit' from source: magic vars 12180 1727204076.73625: variable 'ansible_distribution_major_version' from source: facts 12180 1727204076.73634: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204076.73719: variable 'profile_stat' from source: set_fact 12180 1727204076.73734: Evaluated conditional (profile_stat.stat.exists): False 12180 1727204076.73738: when evaluation is False, skipping this task 12180 1727204076.73741: _execute() done 12180 1727204076.73743: dumping result to json 12180 1727204076.73746: done dumping result, returning 12180 1727204076.73749: done running TaskExecutor() for managed-node1/TASK: Get the fingerprint comment in ifcfg-bond0 [0affcd87-79f5-ccb1-55ae-0000000003bb] 12180 1727204076.73754: sending task result for task 0affcd87-79f5-ccb1-55ae-0000000003bb 12180 1727204076.73844: done sending task result for task 0affcd87-79f5-ccb1-55ae-0000000003bb 12180 1727204076.73847: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 12180 1727204076.73900: no more pending results, returning what we have 12180 1727204076.73904: results queue empty 12180 1727204076.73905: checking for any_errors_fatal 12180 1727204076.73911: done checking for any_errors_fatal 12180 1727204076.73911: checking for 
max_fail_percentage 12180 1727204076.73913: done checking for max_fail_percentage 12180 1727204076.73914: checking to see if all hosts have failed and the running result is not ok 12180 1727204076.73915: done checking to see if all hosts have failed 12180 1727204076.73915: getting the remaining hosts for this loop 12180 1727204076.73916: done getting the remaining hosts for this loop 12180 1727204076.73920: getting the next task for host managed-node1 12180 1727204076.73927: done getting next task for host managed-node1 12180 1727204076.73932: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 12180 1727204076.73937: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12180 1727204076.73941: getting variables 12180 1727204076.73942: in VariableManager get_vars() 12180 1727204076.73990: Calling all_inventory to load vars for managed-node1 12180 1727204076.73993: Calling groups_inventory to load vars for managed-node1 12180 1727204076.73994: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204076.74005: Calling all_plugins_play to load vars for managed-node1 12180 1727204076.74008: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204076.74011: Calling groups_plugins_play to load vars for managed-node1 12180 1727204076.75328: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204076.76686: done with get_vars() 12180 1727204076.76706: done getting variables 12180 1727204076.76754: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 12180 1727204076.76842: variable 'profile' from source: include params 12180 1727204076.76845: variable 'item' from source: include params 12180 1727204076.76888: variable 'item' from source: include params TASK [Verify the fingerprint comment in ifcfg-bond0] *************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Tuesday 24 September 2024 14:54:36 -0400 (0:00:00.043) 0:00:24.181 ***** 12180 1727204076.76911: entering _queue_task() for managed-node1/set_fact 12180 1727204076.77145: worker is 1 (out of 1 available) 12180 1727204076.77158: exiting _queue_task() for managed-node1/set_fact 12180 1727204076.77173: done queuing things up, now waiting for results queue to drain 12180 1727204076.77175: waiting for pending results... 
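The next task, at get_profile_stat.yml:69, is a `set_fact` guarded by the same condition, so it will be skipped the same way. A sketch under stated assumptions — the log confirms only the task name, the `set_fact` action, and the `when` condition; the fact name and its value expression are guesses based on the `lsr_net_profile_fingerprint` variable asserted later in this run:

```yaml
# Hypothetical sketch of get_profile_stat.yml:69. The fact name and value
# expression are assumptions; only the action and when-condition are logged.
- name: Verify the fingerprint comment in ifcfg-{{ profile }}
  set_fact:
    lsr_net_profile_fingerprint: "{{ 'fingerprint' in fingerprint_comment.stdout }}"  # assumed
  when: profile_stat.stat.exists
```

Note that `lsr_net_profile_fingerprint` still evaluates True in the later assert even though this task is skipped here, so the fact must have been set by an earlier code path (e.g. an NM-profile branch rather than the ifcfg branch).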
12180 1727204076.77356: running TaskExecutor() for managed-node1/TASK: Verify the fingerprint comment in ifcfg-bond0 12180 1727204076.77445: in run() - task 0affcd87-79f5-ccb1-55ae-0000000003bc 12180 1727204076.77455: variable 'ansible_search_path' from source: unknown 12180 1727204076.77459: variable 'ansible_search_path' from source: unknown 12180 1727204076.77490: calling self._execute() 12180 1727204076.77570: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204076.77574: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204076.77585: variable 'omit' from source: magic vars 12180 1727204076.77854: variable 'ansible_distribution_major_version' from source: facts 12180 1727204076.77867: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204076.77954: variable 'profile_stat' from source: set_fact 12180 1727204076.77967: Evaluated conditional (profile_stat.stat.exists): False 12180 1727204076.77970: when evaluation is False, skipping this task 12180 1727204076.77972: _execute() done 12180 1727204076.77975: dumping result to json 12180 1727204076.77977: done dumping result, returning 12180 1727204076.77984: done running TaskExecutor() for managed-node1/TASK: Verify the fingerprint comment in ifcfg-bond0 [0affcd87-79f5-ccb1-55ae-0000000003bc] 12180 1727204076.77989: sending task result for task 0affcd87-79f5-ccb1-55ae-0000000003bc 12180 1727204076.78079: done sending task result for task 0affcd87-79f5-ccb1-55ae-0000000003bc 12180 1727204076.78082: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 12180 1727204076.78130: no more pending results, returning what we have 12180 1727204076.78134: results queue empty 12180 1727204076.78135: checking for any_errors_fatal 12180 1727204076.78140: done checking for any_errors_fatal 12180 1727204076.78141: checking for 
max_fail_percentage 12180 1727204076.78143: done checking for max_fail_percentage 12180 1727204076.78144: checking to see if all hosts have failed and the running result is not ok 12180 1727204076.78145: done checking to see if all hosts have failed 12180 1727204076.78145: getting the remaining hosts for this loop 12180 1727204076.78146: done getting the remaining hosts for this loop 12180 1727204076.78150: getting the next task for host managed-node1 12180 1727204076.78159: done getting next task for host managed-node1 12180 1727204076.78161: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 12180 1727204076.78167: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12180 1727204076.78172: getting variables 12180 1727204076.78173: in VariableManager get_vars() 12180 1727204076.78220: Calling all_inventory to load vars for managed-node1 12180 1727204076.78222: Calling groups_inventory to load vars for managed-node1 12180 1727204076.78224: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204076.78235: Calling all_plugins_play to load vars for managed-node1 12180 1727204076.78237: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204076.78239: Calling groups_plugins_play to load vars for managed-node1 12180 1727204076.79772: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204076.81927: done with get_vars() 12180 1727204076.81960: done getting variables 12180 1727204076.82018: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 12180 1727204076.82140: variable 'profile' from source: include params 12180 1727204076.82144: variable 'item' from source: include params 12180 1727204076.82212: variable 'item' from source: include params TASK [Assert that the profile is present - 'bond0'] **************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Tuesday 24 September 2024 14:54:36 -0400 (0:00:00.053) 0:00:24.234 ***** 12180 1727204076.82245: entering _queue_task() for managed-node1/assert 12180 1727204076.82594: worker is 1 (out of 1 available) 12180 1727204076.82606: exiting _queue_task() for managed-node1/assert 12180 1727204076.82618: done queuing things up, now waiting for results queue to drain 12180 1727204076.82620: waiting for pending results... 
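The three assertions that run next come from assert_profile_present.yml (task paths :5, :10, :15 in the log) and all follow the same pattern: a bare `assert` against a fact set earlier by get_profile_stat.yml. A minimal sketch, assuming only the task names and the conditionals the log shows evaluating to True (any `msg` wording is an assumption):

```yaml
# Minimal sketch of assert_profile_present.yml:5-15, reconstructed from the
# task names and evaluated conditionals in the log. msg texts are assumed.
- name: Assert that the profile is present - '{{ profile }}'
  assert:
    that: lsr_net_profile_exists
    msg: "Profile {{ profile }} was not found"  # assumed wording

- name: Assert that the ansible managed comment is present in '{{ profile }}'
  assert:
    that: lsr_net_profile_ansible_managed

- name: Assert that the fingerprint comment is present in {{ profile }}
  assert:
    that: lsr_net_profile_fingerprint
```

Each assert is local to the controller-side action plugin, so the log shows the connection variables being resolved but no actual remote execution before `ok: [managed-node1]` with "All assertions passed".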
12180 1727204076.82920: running TaskExecutor() for managed-node1/TASK: Assert that the profile is present - 'bond0' 12180 1727204076.83043: in run() - task 0affcd87-79f5-ccb1-55ae-000000000261 12180 1727204076.83071: variable 'ansible_search_path' from source: unknown 12180 1727204076.83080: variable 'ansible_search_path' from source: unknown 12180 1727204076.83123: calling self._execute() 12180 1727204076.83239: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204076.83250: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204076.83267: variable 'omit' from source: magic vars 12180 1727204076.83640: variable 'ansible_distribution_major_version' from source: facts 12180 1727204076.83658: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204076.83672: variable 'omit' from source: magic vars 12180 1727204076.83724: variable 'omit' from source: magic vars 12180 1727204076.83834: variable 'profile' from source: include params 12180 1727204076.83844: variable 'item' from source: include params 12180 1727204076.83911: variable 'item' from source: include params 12180 1727204076.83938: variable 'omit' from source: magic vars 12180 1727204076.83986: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12180 1727204076.84028: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12180 1727204076.84060: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12180 1727204076.84084: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204076.84101: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204076.84138: variable 'inventory_hostname' from source: host vars for 
'managed-node1' 12180 1727204076.84152: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204076.84161: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204076.84272: Set connection var ansible_pipelining to False 12180 1727204076.84280: Set connection var ansible_shell_type to sh 12180 1727204076.84292: Set connection var ansible_timeout to 10 12180 1727204076.84302: Set connection var ansible_connection to ssh 12180 1727204076.84312: Set connection var ansible_shell_executable to /bin/sh 12180 1727204076.84322: Set connection var ansible_module_compression to ZIP_DEFLATED 12180 1727204076.84353: variable 'ansible_shell_executable' from source: unknown 12180 1727204076.84363: variable 'ansible_connection' from source: unknown 12180 1727204076.84375: variable 'ansible_module_compression' from source: unknown 12180 1727204076.84382: variable 'ansible_shell_type' from source: unknown 12180 1727204076.84389: variable 'ansible_shell_executable' from source: unknown 12180 1727204076.84395: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204076.84403: variable 'ansible_pipelining' from source: unknown 12180 1727204076.84411: variable 'ansible_timeout' from source: unknown 12180 1727204076.84418: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204076.84565: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12180 1727204076.84585: variable 'omit' from source: magic vars 12180 1727204076.84598: starting attempt loop 12180 1727204076.84605: running the handler 12180 1727204076.84737: variable 'lsr_net_profile_exists' from source: set_fact 12180 1727204076.84748: Evaluated conditional 
(lsr_net_profile_exists): True 12180 1727204076.84758: handler run complete 12180 1727204076.84779: attempt loop complete, returning result 12180 1727204076.84785: _execute() done 12180 1727204076.84791: dumping result to json 12180 1727204076.84798: done dumping result, returning 12180 1727204076.84812: done running TaskExecutor() for managed-node1/TASK: Assert that the profile is present - 'bond0' [0affcd87-79f5-ccb1-55ae-000000000261] 12180 1727204076.84821: sending task result for task 0affcd87-79f5-ccb1-55ae-000000000261 ok: [managed-node1] => { "changed": false } MSG: All assertions passed 12180 1727204076.84971: no more pending results, returning what we have 12180 1727204076.84975: results queue empty 12180 1727204076.84976: checking for any_errors_fatal 12180 1727204076.84982: done checking for any_errors_fatal 12180 1727204076.84983: checking for max_fail_percentage 12180 1727204076.84985: done checking for max_fail_percentage 12180 1727204076.84986: checking to see if all hosts have failed and the running result is not ok 12180 1727204076.84987: done checking to see if all hosts have failed 12180 1727204076.84988: getting the remaining hosts for this loop 12180 1727204076.84989: done getting the remaining hosts for this loop 12180 1727204076.84993: getting the next task for host managed-node1 12180 1727204076.85001: done getting next task for host managed-node1 12180 1727204076.85003: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 12180 1727204076.85006: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12180 1727204076.85011: getting variables 12180 1727204076.85013: in VariableManager get_vars() 12180 1727204076.85058: Calling all_inventory to load vars for managed-node1 12180 1727204076.85062: Calling groups_inventory to load vars for managed-node1 12180 1727204076.85066: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204076.85078: Calling all_plugins_play to load vars for managed-node1 12180 1727204076.85081: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204076.85084: Calling groups_plugins_play to load vars for managed-node1 12180 1727204076.86082: done sending task result for task 0affcd87-79f5-ccb1-55ae-000000000261 12180 1727204076.86086: WORKER PROCESS EXITING 12180 1727204076.86307: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204076.87369: done with get_vars() 12180 1727204076.87402: done getting variables 12180 1727204076.87485: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 12180 1727204076.87660: variable 'profile' from source: include params 12180 1727204076.87667: variable 'item' from source: include params 12180 1727204076.87736: variable 'item' from source: include params TASK [Assert that the ansible managed comment is present in 'bond0'] *********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Tuesday 24 September 2024 14:54:36 -0400 (0:00:00.055) 0:00:24.289 ***** 12180 1727204076.87776: entering _queue_task() for managed-node1/assert 12180 
1727204076.88117: worker is 1 (out of 1 available) 12180 1727204076.88137: exiting _queue_task() for managed-node1/assert 12180 1727204076.88149: done queuing things up, now waiting for results queue to drain 12180 1727204076.88151: waiting for pending results... 12180 1727204076.88586: running TaskExecutor() for managed-node1/TASK: Assert that the ansible managed comment is present in 'bond0' 12180 1727204076.88685: in run() - task 0affcd87-79f5-ccb1-55ae-000000000262 12180 1727204076.88706: variable 'ansible_search_path' from source: unknown 12180 1727204076.88710: variable 'ansible_search_path' from source: unknown 12180 1727204076.88739: calling self._execute() 12180 1727204076.88816: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204076.88819: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204076.88831: variable 'omit' from source: magic vars 12180 1727204076.89100: variable 'ansible_distribution_major_version' from source: facts 12180 1727204076.89111: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204076.89116: variable 'omit' from source: magic vars 12180 1727204076.89148: variable 'omit' from source: magic vars 12180 1727204076.89223: variable 'profile' from source: include params 12180 1727204076.89231: variable 'item' from source: include params 12180 1727204076.89276: variable 'item' from source: include params 12180 1727204076.89293: variable 'omit' from source: magic vars 12180 1727204076.89327: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12180 1727204076.89357: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12180 1727204076.89376: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12180 1727204076.89389: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 12180 1727204076.89400: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204076.89424: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12180 1727204076.89427: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204076.89432: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204076.89506: Set connection var ansible_pipelining to False 12180 1727204076.89510: Set connection var ansible_shell_type to sh 12180 1727204076.89512: Set connection var ansible_timeout to 10 12180 1727204076.89518: Set connection var ansible_connection to ssh 12180 1727204076.89522: Set connection var ansible_shell_executable to /bin/sh 12180 1727204076.89531: Set connection var ansible_module_compression to ZIP_DEFLATED 12180 1727204076.89550: variable 'ansible_shell_executable' from source: unknown 12180 1727204076.89553: variable 'ansible_connection' from source: unknown 12180 1727204076.89556: variable 'ansible_module_compression' from source: unknown 12180 1727204076.89560: variable 'ansible_shell_type' from source: unknown 12180 1727204076.89562: variable 'ansible_shell_executable' from source: unknown 12180 1727204076.89564: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204076.89571: variable 'ansible_pipelining' from source: unknown 12180 1727204076.89574: variable 'ansible_timeout' from source: unknown 12180 1727204076.89576: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204076.89678: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12180 
1727204076.89688: variable 'omit' from source: magic vars 12180 1727204076.89691: starting attempt loop 12180 1727204076.89694: running the handler 12180 1727204076.89776: variable 'lsr_net_profile_ansible_managed' from source: set_fact 12180 1727204076.89779: Evaluated conditional (lsr_net_profile_ansible_managed): True 12180 1727204076.89785: handler run complete 12180 1727204076.89798: attempt loop complete, returning result 12180 1727204076.89801: _execute() done 12180 1727204076.89805: dumping result to json 12180 1727204076.89808: done dumping result, returning 12180 1727204076.89811: done running TaskExecutor() for managed-node1/TASK: Assert that the ansible managed comment is present in 'bond0' [0affcd87-79f5-ccb1-55ae-000000000262] 12180 1727204076.89817: sending task result for task 0affcd87-79f5-ccb1-55ae-000000000262 12180 1727204076.89906: done sending task result for task 0affcd87-79f5-ccb1-55ae-000000000262 12180 1727204076.89909: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false } MSG: All assertions passed 12180 1727204076.89969: no more pending results, returning what we have 12180 1727204076.89973: results queue empty 12180 1727204076.89974: checking for any_errors_fatal 12180 1727204076.89980: done checking for any_errors_fatal 12180 1727204076.89981: checking for max_fail_percentage 12180 1727204076.89983: done checking for max_fail_percentage 12180 1727204076.89984: checking to see if all hosts have failed and the running result is not ok 12180 1727204076.89985: done checking to see if all hosts have failed 12180 1727204076.89986: getting the remaining hosts for this loop 12180 1727204076.89987: done getting the remaining hosts for this loop 12180 1727204076.89990: getting the next task for host managed-node1 12180 1727204076.89996: done getting next task for host managed-node1 12180 1727204076.89999: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 12180 1727204076.90002: ^ state is: HOST STATE: 
block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12180 1727204076.90007: getting variables 12180 1727204076.90009: in VariableManager get_vars() 12180 1727204076.90050: Calling all_inventory to load vars for managed-node1 12180 1727204076.90053: Calling groups_inventory to load vars for managed-node1 12180 1727204076.90055: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204076.90066: Calling all_plugins_play to load vars for managed-node1 12180 1727204076.90069: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204076.90071: Calling groups_plugins_play to load vars for managed-node1 12180 1727204076.91362: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204076.93220: done with get_vars() 12180 1727204076.93255: done getting variables 12180 1727204076.93334: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 12180 1727204076.93468: variable 'profile' from source: include params 12180 1727204076.93472: variable 'item' from source: include params 12180 1727204076.93546: variable 'item' from source: include params TASK [Assert that the fingerprint comment is present in bond0] ***************** task path: 
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Tuesday 24 September 2024 14:54:36 -0400 (0:00:00.058) 0:00:24.347 ***** 12180 1727204076.93586: entering _queue_task() for managed-node1/assert 12180 1727204076.93943: worker is 1 (out of 1 available) 12180 1727204076.93955: exiting _queue_task() for managed-node1/assert 12180 1727204076.93973: done queuing things up, now waiting for results queue to drain 12180 1727204076.93978: waiting for pending results... 12180 1727204076.94276: running TaskExecutor() for managed-node1/TASK: Assert that the fingerprint comment is present in bond0 12180 1727204076.94399: in run() - task 0affcd87-79f5-ccb1-55ae-000000000263 12180 1727204076.94431: variable 'ansible_search_path' from source: unknown 12180 1727204076.94440: variable 'ansible_search_path' from source: unknown 12180 1727204076.94484: calling self._execute() 12180 1727204076.94601: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204076.94613: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204076.94641: variable 'omit' from source: magic vars 12180 1727204076.95043: variable 'ansible_distribution_major_version' from source: facts 12180 1727204076.95068: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204076.95085: variable 'omit' from source: magic vars 12180 1727204076.95132: variable 'omit' from source: magic vars 12180 1727204076.95253: variable 'profile' from source: include params 12180 1727204076.95263: variable 'item' from source: include params 12180 1727204076.95344: variable 'item' from source: include params 12180 1727204076.95371: variable 'omit' from source: magic vars 12180 1727204076.95431: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12180 1727204076.95478: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12180 1727204076.95516: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12180 1727204076.95543: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204076.95561: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204076.95598: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12180 1727204076.95614: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204076.95626: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204076.95746: Set connection var ansible_pipelining to False 12180 1727204076.95755: Set connection var ansible_shell_type to sh 12180 1727204076.95767: Set connection var ansible_timeout to 10 12180 1727204076.95777: Set connection var ansible_connection to ssh 12180 1727204076.95785: Set connection var ansible_shell_executable to /bin/sh 12180 1727204076.95794: Set connection var ansible_module_compression to ZIP_DEFLATED 12180 1727204076.95833: variable 'ansible_shell_executable' from source: unknown 12180 1727204076.95846: variable 'ansible_connection' from source: unknown 12180 1727204076.95853: variable 'ansible_module_compression' from source: unknown 12180 1727204076.95860: variable 'ansible_shell_type' from source: unknown 12180 1727204076.95869: variable 'ansible_shell_executable' from source: unknown 12180 1727204076.95876: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204076.95884: variable 'ansible_pipelining' from source: unknown 12180 1727204076.95890: variable 'ansible_timeout' from source: unknown 12180 1727204076.95897: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 
1727204076.96053: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12180 1727204076.96077: variable 'omit' from source: magic vars 12180 1727204076.96088: starting attempt loop 12180 1727204076.96094: running the handler 12180 1727204076.96220: variable 'lsr_net_profile_fingerprint' from source: set_fact 12180 1727204076.96234: Evaluated conditional (lsr_net_profile_fingerprint): True 12180 1727204076.96246: handler run complete 12180 1727204076.96276: attempt loop complete, returning result 12180 1727204076.96287: _execute() done 12180 1727204076.96294: dumping result to json 12180 1727204076.96303: done dumping result, returning 12180 1727204076.96314: done running TaskExecutor() for managed-node1/TASK: Assert that the fingerprint comment is present in bond0 [0affcd87-79f5-ccb1-55ae-000000000263] 12180 1727204076.96324: sending task result for task 0affcd87-79f5-ccb1-55ae-000000000263 ok: [managed-node1] => { "changed": false } MSG: All assertions passed 12180 1727204076.96490: no more pending results, returning what we have 12180 1727204076.96494: results queue empty 12180 1727204076.96496: checking for any_errors_fatal 12180 1727204076.96502: done checking for any_errors_fatal 12180 1727204076.96503: checking for max_fail_percentage 12180 1727204076.96505: done checking for max_fail_percentage 12180 1727204076.96506: checking to see if all hosts have failed and the running result is not ok 12180 1727204076.96507: done checking to see if all hosts have failed 12180 1727204076.96508: getting the remaining hosts for this loop 12180 1727204076.96510: done getting the remaining hosts for this loop 12180 1727204076.96514: getting the next task for host managed-node1 12180 1727204076.96525: done getting next task for 
host managed-node1 12180 1727204076.96528: ^ task is: TASK: Include the task 'get_profile_stat.yml' 12180 1727204076.96535: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12180 1727204076.96540: getting variables 12180 1727204076.96543: in VariableManager get_vars() 12180 1727204076.96592: Calling all_inventory to load vars for managed-node1 12180 1727204076.96596: Calling groups_inventory to load vars for managed-node1 12180 1727204076.96598: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204076.96611: Calling all_plugins_play to load vars for managed-node1 12180 1727204076.96614: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204076.96618: Calling groups_plugins_play to load vars for managed-node1 12180 1727204076.97616: done sending task result for task 0affcd87-79f5-ccb1-55ae-000000000263 12180 1727204076.97620: WORKER PROCESS EXITING 12180 1727204076.98749: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204077.00545: done with get_vars() 12180 1727204077.00585: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Tuesday 24 September 2024 14:54:37 -0400 (0:00:00.071) 0:00:24.419 ***** 12180 1727204077.00715: entering _queue_task() for 
managed-node1/include_tasks 12180 1727204077.01080: worker is 1 (out of 1 available) 12180 1727204077.01093: exiting _queue_task() for managed-node1/include_tasks 12180 1727204077.01105: done queuing things up, now waiting for results queue to drain 12180 1727204077.01107: waiting for pending results... 12180 1727204077.01421: running TaskExecutor() for managed-node1/TASK: Include the task 'get_profile_stat.yml' 12180 1727204077.01578: in run() - task 0affcd87-79f5-ccb1-55ae-000000000267 12180 1727204077.01599: variable 'ansible_search_path' from source: unknown 12180 1727204077.01609: variable 'ansible_search_path' from source: unknown 12180 1727204077.01659: calling self._execute() 12180 1727204077.01769: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204077.01789: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204077.01806: variable 'omit' from source: magic vars 12180 1727204077.02218: variable 'ansible_distribution_major_version' from source: facts 12180 1727204077.02243: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204077.02255: _execute() done 12180 1727204077.02263: dumping result to json 12180 1727204077.02273: done dumping result, returning 12180 1727204077.02282: done running TaskExecutor() for managed-node1/TASK: Include the task 'get_profile_stat.yml' [0affcd87-79f5-ccb1-55ae-000000000267] 12180 1727204077.02292: sending task result for task 0affcd87-79f5-ccb1-55ae-000000000267 12180 1727204077.02449: no more pending results, returning what we have 12180 1727204077.02458: in VariableManager get_vars() 12180 1727204077.02512: Calling all_inventory to load vars for managed-node1 12180 1727204077.02515: Calling groups_inventory to load vars for managed-node1 12180 1727204077.02518: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204077.02535: Calling all_plugins_play to load vars for managed-node1 12180 1727204077.02538: 
Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204077.02541: Calling groups_plugins_play to load vars for managed-node1 12180 1727204077.04574: done sending task result for task 0affcd87-79f5-ccb1-55ae-000000000267 12180 1727204077.04578: WORKER PROCESS EXITING 12180 1727204077.05428: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204077.08266: done with get_vars() 12180 1727204077.08298: variable 'ansible_search_path' from source: unknown 12180 1727204077.08299: variable 'ansible_search_path' from source: unknown 12180 1727204077.08342: we have included files to process 12180 1727204077.08343: generating all_blocks data 12180 1727204077.08345: done generating all_blocks data 12180 1727204077.08351: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 12180 1727204077.08352: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 12180 1727204077.08470: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 12180 1727204077.10481: done processing included file 12180 1727204077.10484: iterating over new_blocks loaded from include file 12180 1727204077.10486: in VariableManager get_vars() 12180 1727204077.10509: done with get_vars() 12180 1727204077.10511: filtering new block on tags 12180 1727204077.10658: done filtering new block on tags 12180 1727204077.10662: in VariableManager get_vars() 12180 1727204077.10685: done with get_vars() 12180 1727204077.10688: filtering new block on tags 12180 1727204077.10712: done filtering new block on tags 12180 1727204077.10715: done iterating over new_blocks loaded from include file included: 
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed-node1 12180 1727204077.10720: extending task lists for all hosts with included blocks 12180 1727204077.11046: done extending task lists 12180 1727204077.11048: done processing included files 12180 1727204077.11049: results queue empty 12180 1727204077.11050: checking for any_errors_fatal 12180 1727204077.11053: done checking for any_errors_fatal 12180 1727204077.11054: checking for max_fail_percentage 12180 1727204077.11055: done checking for max_fail_percentage 12180 1727204077.11056: checking to see if all hosts have failed and the running result is not ok 12180 1727204077.11057: done checking to see if all hosts have failed 12180 1727204077.11057: getting the remaining hosts for this loop 12180 1727204077.11059: done getting the remaining hosts for this loop 12180 1727204077.11061: getting the next task for host managed-node1 12180 1727204077.11067: done getting next task for host managed-node1 12180 1727204077.11070: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 12180 1727204077.11073: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 12180 1727204077.11076: getting variables 12180 1727204077.11077: in VariableManager get_vars() 12180 1727204077.11097: Calling all_inventory to load vars for managed-node1 12180 1727204077.11100: Calling groups_inventory to load vars for managed-node1 12180 1727204077.11103: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204077.11108: Calling all_plugins_play to load vars for managed-node1 12180 1727204077.11110: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204077.11113: Calling groups_plugins_play to load vars for managed-node1 12180 1727204077.12584: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204077.15463: done with get_vars() 12180 1727204077.15497: done getting variables 12180 1727204077.15547: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Tuesday 24 September 2024 14:54:37 -0400 (0:00:00.148) 0:00:24.568 ***** 12180 1727204077.15606: entering _queue_task() for managed-node1/set_fact 12180 1727204077.15948: worker is 1 (out of 1 available) 12180 1727204077.15961: exiting _queue_task() for managed-node1/set_fact 12180 1727204077.15976: done queuing things up, now waiting for results queue to drain 12180 1727204077.15978: waiting for pending results... 
12180 1727204077.16300: running TaskExecutor() for managed-node1/TASK: Initialize NM profile exist and ansible_managed comment flag 12180 1727204077.16623: in run() - task 0affcd87-79f5-ccb1-55ae-0000000003fb 12180 1727204077.16773: variable 'ansible_search_path' from source: unknown 12180 1727204077.16782: variable 'ansible_search_path' from source: unknown 12180 1727204077.16833: calling self._execute() 12180 1727204077.17102: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204077.17119: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204077.17136: variable 'omit' from source: magic vars 12180 1727204077.18132: variable 'ansible_distribution_major_version' from source: facts 12180 1727204077.18160: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204077.18182: variable 'omit' from source: magic vars 12180 1727204077.18246: variable 'omit' from source: magic vars 12180 1727204077.18290: variable 'omit' from source: magic vars 12180 1727204077.18348: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12180 1727204077.18391: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12180 1727204077.18426: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12180 1727204077.18452: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204077.18469: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204077.18502: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12180 1727204077.18511: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204077.18523: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node1' 12180 1727204077.18638: Set connection var ansible_pipelining to False 12180 1727204077.18648: Set connection var ansible_shell_type to sh 12180 1727204077.18660: Set connection var ansible_timeout to 10 12180 1727204077.18673: Set connection var ansible_connection to ssh 12180 1727204077.18683: Set connection var ansible_shell_executable to /bin/sh 12180 1727204077.18692: Set connection var ansible_module_compression to ZIP_DEFLATED 12180 1727204077.18723: variable 'ansible_shell_executable' from source: unknown 12180 1727204077.18740: variable 'ansible_connection' from source: unknown 12180 1727204077.18749: variable 'ansible_module_compression' from source: unknown 12180 1727204077.18756: variable 'ansible_shell_type' from source: unknown 12180 1727204077.18766: variable 'ansible_shell_executable' from source: unknown 12180 1727204077.18774: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204077.18782: variable 'ansible_pipelining' from source: unknown 12180 1727204077.18789: variable 'ansible_timeout' from source: unknown 12180 1727204077.18797: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204077.18963: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12180 1727204077.18983: variable 'omit' from source: magic vars 12180 1727204077.18995: starting attempt loop 12180 1727204077.19001: running the handler 12180 1727204077.19018: handler run complete 12180 1727204077.19036: attempt loop complete, returning result 12180 1727204077.19043: _execute() done 12180 1727204077.19049: dumping result to json 12180 1727204077.19058: done dumping result, returning 12180 1727204077.19078: done running TaskExecutor() for 
managed-node1/TASK: Initialize NM profile exist and ansible_managed comment flag [0affcd87-79f5-ccb1-55ae-0000000003fb] 12180 1727204077.19087: sending task result for task 0affcd87-79f5-ccb1-55ae-0000000003fb ok: [managed-node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 12180 1727204077.19259: no more pending results, returning what we have 12180 1727204077.19263: results queue empty 12180 1727204077.19266: checking for any_errors_fatal 12180 1727204077.19268: done checking for any_errors_fatal 12180 1727204077.19269: checking for max_fail_percentage 12180 1727204077.19271: done checking for max_fail_percentage 12180 1727204077.19272: checking to see if all hosts have failed and the running result is not ok 12180 1727204077.19273: done checking to see if all hosts have failed 12180 1727204077.19273: getting the remaining hosts for this loop 12180 1727204077.19275: done getting the remaining hosts for this loop 12180 1727204077.19279: getting the next task for host managed-node1 12180 1727204077.19287: done getting next task for host managed-node1 12180 1727204077.19290: ^ task is: TASK: Stat profile file 12180 1727204077.19295: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12180 1727204077.19300: getting variables 12180 1727204077.19302: in VariableManager get_vars() 12180 1727204077.19353: Calling all_inventory to load vars for managed-node1 12180 1727204077.19356: Calling groups_inventory to load vars for managed-node1 12180 1727204077.19359: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204077.19373: Calling all_plugins_play to load vars for managed-node1 12180 1727204077.19375: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204077.19378: Calling groups_plugins_play to load vars for managed-node1 12180 1727204077.20411: done sending task result for task 0affcd87-79f5-ccb1-55ae-0000000003fb 12180 1727204077.20415: WORKER PROCESS EXITING 12180 1727204077.21917: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204077.23925: done with get_vars() 12180 1727204077.23951: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Tuesday 24 September 2024 14:54:37 -0400 (0:00:00.085) 0:00:24.653 ***** 12180 1727204077.24167: entering _queue_task() for managed-node1/stat 12180 1727204077.24511: worker is 1 (out of 1 available) 12180 1727204077.24523: exiting _queue_task() for managed-node1/stat 12180 1727204077.24538: done queuing things up, now waiting for results queue to drain 12180 1727204077.24540: waiting for pending results... 
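The `_low_level_execute_command()` entries that follow show the standard remote-setup sequence for a module like `stat`: probe the home directory with `echo ~`, then create a private per-task temp directory under `~/.ansible/tmp` with `umask 77` before uploading the AnsiballZ payload. A simplified local reproduction of that temp-dir pattern, assuming a throwaway base directory in place of the managed node's `~/.ansible/tmp` (the real command runs over the multiplexed SSH connection via `/bin/sh -c`, with the backquoted `echo` indirection the log captures):

```python
import os
import subprocess
import tempfile
import time

# Stand-in for /root/.ansible/tmp on the managed node (hypothetical path).
base = tempfile.mkdtemp()

# Ansible names the directory ansible-tmp-<timestamp>-<pid>-<random>; a
# simplified name is used here for illustration.
name = f"ansible-tmp-{time.time()}-sketch"
cmd = (
    f'( umask 77 && mkdir -p "{base}" && mkdir "{base}/{name}" '
    f'&& echo "{base}/{name}" ) && sleep 0'
)
out = subprocess.run(["/bin/sh", "-c", cmd], capture_output=True, text=True)

# The command echoes the created path back so the controller learns it,
# exactly as the log's stdout chunk reports the ansible-tmp-... path.
tmpdir = out.stdout.strip()
```

Because of `umask 77`, the directory is created mode 0700, readable only by the connecting user; the trailing `sleep 0` is a portability idiom Ansible appends to its shell one-liners.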
12180 1727204077.25485: running TaskExecutor() for managed-node1/TASK: Stat profile file 12180 1727204077.25699: in run() - task 0affcd87-79f5-ccb1-55ae-0000000003fc 12180 1727204077.25711: variable 'ansible_search_path' from source: unknown 12180 1727204077.25714: variable 'ansible_search_path' from source: unknown 12180 1727204077.25863: calling self._execute() 12180 1727204077.26035: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204077.26040: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204077.26048: variable 'omit' from source: magic vars 12180 1727204077.26561: variable 'ansible_distribution_major_version' from source: facts 12180 1727204077.26585: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204077.26597: variable 'omit' from source: magic vars 12180 1727204077.26663: variable 'omit' from source: magic vars 12180 1727204077.26783: variable 'profile' from source: include params 12180 1727204077.26794: variable 'item' from source: include params 12180 1727204077.26880: variable 'item' from source: include params 12180 1727204077.26904: variable 'omit' from source: magic vars 12180 1727204077.26967: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12180 1727204077.27010: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12180 1727204077.27046: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12180 1727204077.27075: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204077.27094: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204077.27134: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12180 
1727204077.27147: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204077.27163: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204077.27287: Set connection var ansible_pipelining to False 12180 1727204077.27296: Set connection var ansible_shell_type to sh 12180 1727204077.27307: Set connection var ansible_timeout to 10 12180 1727204077.27316: Set connection var ansible_connection to ssh 12180 1727204077.27326: Set connection var ansible_shell_executable to /bin/sh 12180 1727204077.27339: Set connection var ansible_module_compression to ZIP_DEFLATED 12180 1727204077.27374: variable 'ansible_shell_executable' from source: unknown 12180 1727204077.27385: variable 'ansible_connection' from source: unknown 12180 1727204077.27391: variable 'ansible_module_compression' from source: unknown 12180 1727204077.27397: variable 'ansible_shell_type' from source: unknown 12180 1727204077.27403: variable 'ansible_shell_executable' from source: unknown 12180 1727204077.27408: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204077.27414: variable 'ansible_pipelining' from source: unknown 12180 1727204077.27419: variable 'ansible_timeout' from source: unknown 12180 1727204077.27426: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204077.27636: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 12180 1727204077.27650: variable 'omit' from source: magic vars 12180 1727204077.27660: starting attempt loop 12180 1727204077.27668: running the handler 12180 1727204077.27685: _low_level_execute_command(): starting 12180 1727204077.27703: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12180 1727204077.28615: stderr chunk 
(state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204077.28633: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204077.28649: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204077.28673: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204077.28726: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204077.28741: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204077.28754: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204077.28774: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204077.28789: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204077.28812: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204077.28826: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204077.28843: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204077.28858: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204077.28873: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204077.28884: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204077.28903: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204077.28995: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204077.29022: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204077.29047: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 12180 1727204077.29146: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204077.30802: stdout chunk (state=3): >>>/root <<< 12180 1727204077.31005: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204077.31008: stdout chunk (state=3): >>><<< 12180 1727204077.31011: stderr chunk (state=3): >>><<< 12180 1727204077.31123: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204077.31127: _low_level_execute_command(): starting 12180 1727204077.31136: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204077.3103502-14816-103735671277344 `" && echo ansible-tmp-1727204077.3103502-14816-103735671277344="` 
echo /root/.ansible/tmp/ansible-tmp-1727204077.3103502-14816-103735671277344 `" ) && sleep 0' 12180 1727204077.31783: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204077.31808: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204077.31825: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204077.31847: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204077.31896: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204077.31909: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204077.31927: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204077.31950: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204077.31963: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204077.31977: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204077.31993: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204077.32008: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204077.32024: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204077.32039: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204077.32050: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204077.32063: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204077.32151: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 
1727204077.32184: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204077.32201: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204077.32441: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204077.34155: stdout chunk (state=3): >>>ansible-tmp-1727204077.3103502-14816-103735671277344=/root/.ansible/tmp/ansible-tmp-1727204077.3103502-14816-103735671277344 <<< 12180 1727204077.34282: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204077.34349: stderr chunk (state=3): >>><<< 12180 1727204077.34351: stdout chunk (state=3): >>><<< 12180 1727204077.34370: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204077.3103502-14816-103735671277344=/root/.ansible/tmp/ansible-tmp-1727204077.3103502-14816-103735671277344 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received 
exit status from master 0 12180 1727204077.34410: variable 'ansible_module_compression' from source: unknown 12180 1727204077.34463: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12180cbnqllfr/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 12180 1727204077.34492: variable 'ansible_facts' from source: unknown 12180 1727204077.34558: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204077.3103502-14816-103735671277344/AnsiballZ_stat.py 12180 1727204077.34674: Sending initial data 12180 1727204077.34677: Sent initial data (153 bytes) 12180 1727204077.35821: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204077.35825: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204077.35884: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204077.37569: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 12180 1727204077.37584: stderr chunk (state=3): >>>debug2: Server supports extension 
"posix-rename@openssh.com" revision 1 <<< 12180 1727204077.37592: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 12180 1727204077.37599: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 12180 1727204077.37606: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 <<< 12180 1727204077.37613: stderr chunk (state=3): >>>debug2: Server supports extension "fsync@openssh.com" revision 1 <<< 12180 1727204077.37620: stderr chunk (state=3): >>>debug2: Server supports extension "lsetstat@openssh.com" revision 1 <<< 12180 1727204077.37633: stderr chunk (state=3): >>>debug2: Server supports extension "limits@openssh.com" revision 1 <<< 12180 1727204077.37650: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12180 1727204077.37715: stderr chunk (state=3): >>>debug1: Using server download size 261120 <<< 12180 1727204077.37728: stderr chunk (state=3): >>>debug1: Using server upload size 261120 <<< 12180 1727204077.37743: stderr chunk (state=3): >>>debug1: Server handle limit 1019; using 64 <<< 12180 1727204077.37811: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12180cbnqllfr/tmps_lh86zi /root/.ansible/tmp/ansible-tmp-1727204077.3103502-14816-103735671277344/AnsiballZ_stat.py <<< 12180 1727204077.37886: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12180 1727204077.39084: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204077.39186: stderr chunk (state=3): >>><<< 12180 1727204077.39191: stdout chunk (state=3): >>><<< 12180 1727204077.39270: done transferring module to remote 12180 1727204077.39274: _low_level_execute_command(): starting 12180 1727204077.39276: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1727204077.3103502-14816-103735671277344/ /root/.ansible/tmp/ansible-tmp-1727204077.3103502-14816-103735671277344/AnsiballZ_stat.py && sleep 0' 12180 1727204077.39894: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204077.39914: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204077.39933: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204077.39953: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204077.39996: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204077.40012: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204077.40033: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204077.40054: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204077.40067: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204077.40078: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204077.40089: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204077.40102: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204077.40117: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204077.40131: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204077.40143: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204077.40159: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204077.40235: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204077.40257: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204077.40277: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204077.40361: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204077.42084: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204077.42121: stderr chunk (state=3): >>><<< 12180 1727204077.42125: stdout chunk (state=3): >>><<< 12180 1727204077.42144: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204077.42148: _low_level_execute_command(): starting 12180 1727204077.42152: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 
/root/.ansible/tmp/ansible-tmp-1727204077.3103502-14816-103735671277344/AnsiballZ_stat.py && sleep 0' 12180 1727204077.42601: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204077.42604: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204077.42637: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204077.42640: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204077.42643: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204077.42701: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204077.42708: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204077.42770: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204077.57808: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 12180 1727204077.58778: stderr chunk (state=3): >>>debug2: 
Received exit status from master 0 Shared connection to 10.31.9.148 closed. <<< 12180 1727204077.58840: stderr chunk (state=3): >>><<< 12180 1727204077.58844: stdout chunk (state=3): >>><<< 12180 1727204077.58860: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
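For reference, the module result echoed above can be read back into the task that produced it. The following is a sketch reconstructed purely from the logged `module_args`, not the actual playbook source (which lives in `tests/network/playbooks/tasks/get_profile_stat.yml` and may differ); the `register: profile_stat` name is inferred from the later conditional `profile_stat.stat.exists`:

```yaml
# Reconstruction from the logged invocation; hypothetical, not the real source.
- name: Stat profile file
  ansible.builtin.stat:
    path: /etc/sysconfig/network-scripts/ifcfg-bond0.0
    get_attributes: false
    get_checksum: false
    get_mime: false
    follow: false
    checksum_algorithm: sha1
  register: profile_stat   # inferred: the next task tests profile_stat.stat.exists
```

With `"stat": {"exists": false}` in the returned JSON, any follow-up task conditioned on `profile_stat.stat.exists` will be skipped, which is exactly what the log shows next.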
12180 1727204077.58886: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-bond0.0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204077.3103502-14816-103735671277344/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12180 1727204077.58894: _low_level_execute_command(): starting 12180 1727204077.58898: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204077.3103502-14816-103735671277344/ > /dev/null 2>&1 && sleep 0' 12180 1727204077.59550: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204077.59559: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204077.59571: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204077.59586: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204077.59631: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204077.59641: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204077.59657: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204077.59673: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 
1727204077.59680: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204077.59687: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204077.59694: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204077.59703: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204077.59716: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204077.59725: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204077.59734: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204077.59746: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204077.59822: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204077.59841: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204077.59857: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204077.59944: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204077.61723: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204077.61785: stderr chunk (state=3): >>><<< 12180 1727204077.61788: stdout chunk (state=3): >>><<< 12180 1727204077.61805: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204077.61813: handler run complete 12180 1727204077.61832: attempt loop complete, returning result 12180 1727204077.61835: _execute() done 12180 1727204077.61838: dumping result to json 12180 1727204077.61840: done dumping result, returning 12180 1727204077.61846: done running TaskExecutor() for managed-node1/TASK: Stat profile file [0affcd87-79f5-ccb1-55ae-0000000003fc] 12180 1727204077.61851: sending task result for task 0affcd87-79f5-ccb1-55ae-0000000003fc 12180 1727204077.61949: done sending task result for task 0affcd87-79f5-ccb1-55ae-0000000003fc 12180 1727204077.61952: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "stat": { "exists": false } } 12180 1727204077.62031: no more pending results, returning what we have 12180 1727204077.62035: results queue empty 12180 1727204077.62036: checking for any_errors_fatal 12180 1727204077.62044: done checking for any_errors_fatal 12180 1727204077.62044: checking for max_fail_percentage 12180 1727204077.62046: done checking for max_fail_percentage 12180 1727204077.62047: checking to see if all hosts have failed and the running result is not ok 12180 1727204077.62048: done checking to see if all hosts have failed 12180 1727204077.62049: getting the remaining hosts for this loop 12180 
1727204077.62050: done getting the remaining hosts for this loop 12180 1727204077.62053: getting the next task for host managed-node1 12180 1727204077.62060: done getting next task for host managed-node1 12180 1727204077.62064: ^ task is: TASK: Set NM profile exist flag based on the profile files 12180 1727204077.62068: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12180 1727204077.62073: getting variables 12180 1727204077.62074: in VariableManager get_vars() 12180 1727204077.62150: Calling all_inventory to load vars for managed-node1 12180 1727204077.62153: Calling groups_inventory to load vars for managed-node1 12180 1727204077.62155: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204077.62168: Calling all_plugins_play to load vars for managed-node1 12180 1727204077.62170: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204077.62173: Calling groups_plugins_play to load vars for managed-node1 12180 1727204077.63651: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204077.64819: done with get_vars() 12180 1727204077.64840: done getting variables 12180 1727204077.64890: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Tuesday 24 September 2024 14:54:37 -0400 (0:00:00.407) 0:00:25.061 ***** 12180 1727204077.64914: entering _queue_task() for managed-node1/set_fact 12180 1727204077.65147: worker is 1 (out of 1 available) 12180 1727204077.65160: exiting _queue_task() for managed-node1/set_fact 12180 1727204077.65174: done queuing things up, now waiting for results queue to drain 12180 1727204077.65176: waiting for pending results... 
12180 1727204077.65369: running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag based on the profile files 12180 1727204077.65455: in run() - task 0affcd87-79f5-ccb1-55ae-0000000003fd 12180 1727204077.65468: variable 'ansible_search_path' from source: unknown 12180 1727204077.65472: variable 'ansible_search_path' from source: unknown 12180 1727204077.65498: calling self._execute() 12180 1727204077.65572: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204077.65576: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204077.65585: variable 'omit' from source: magic vars 12180 1727204077.65963: variable 'ansible_distribution_major_version' from source: facts 12180 1727204077.66378: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204077.66380: variable 'profile_stat' from source: set_fact 12180 1727204077.66383: Evaluated conditional (profile_stat.stat.exists): False 12180 1727204077.66385: when evaluation is False, skipping this task 12180 1727204077.66387: _execute() done 12180 1727204077.66389: dumping result to json 12180 1727204077.66391: done dumping result, returning 12180 1727204077.66394: done running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag based on the profile files [0affcd87-79f5-ccb1-55ae-0000000003fd] 12180 1727204077.66395: sending task result for task 0affcd87-79f5-ccb1-55ae-0000000003fd 12180 1727204077.66458: done sending task result for task 0affcd87-79f5-ccb1-55ae-0000000003fd 12180 1727204077.66461: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 12180 1727204077.66508: no more pending results, returning what we have 12180 1727204077.66512: results queue empty 12180 1727204077.66513: checking for any_errors_fatal 12180 1727204077.66520: done checking for any_errors_fatal 12180 1727204077.66521: 
checking for max_fail_percentage 12180 1727204077.66523: done checking for max_fail_percentage 12180 1727204077.66523: checking to see if all hosts have failed and the running result is not ok 12180 1727204077.66524: done checking to see if all hosts have failed 12180 1727204077.66525: getting the remaining hosts for this loop 12180 1727204077.66526: done getting the remaining hosts for this loop 12180 1727204077.66529: getting the next task for host managed-node1 12180 1727204077.66536: done getting next task for host managed-node1 12180 1727204077.66538: ^ task is: TASK: Get NM profile info 12180 1727204077.66543: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12180 1727204077.66546: getting variables 12180 1727204077.66547: in VariableManager get_vars() 12180 1727204077.66586: Calling all_inventory to load vars for managed-node1 12180 1727204077.66588: Calling groups_inventory to load vars for managed-node1 12180 1727204077.66591: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204077.66600: Calling all_plugins_play to load vars for managed-node1 12180 1727204077.66603: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204077.66606: Calling groups_plugins_play to load vars for managed-node1 12180 1727204077.71624: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204077.72535: done with get_vars() 12180 1727204077.72556: done getting variables 12180 1727204077.72594: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Tuesday 24 September 2024 14:54:37 -0400 (0:00:00.076) 0:00:25.138 ***** 12180 1727204077.72615: entering _queue_task() for managed-node1/shell 12180 1727204077.72846: worker is 1 (out of 1 available) 12180 1727204077.72858: exiting _queue_task() for managed-node1/shell 12180 1727204077.72873: done queuing things up, now waiting for results queue to drain 12180 1727204077.72874: waiting for pending results... 
12180 1727204077.73059: running TaskExecutor() for managed-node1/TASK: Get NM profile info 12180 1727204077.73144: in run() - task 0affcd87-79f5-ccb1-55ae-0000000003fe 12180 1727204077.73154: variable 'ansible_search_path' from source: unknown 12180 1727204077.73158: variable 'ansible_search_path' from source: unknown 12180 1727204077.73190: calling self._execute() 12180 1727204077.73268: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204077.73276: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204077.73286: variable 'omit' from source: magic vars 12180 1727204077.73578: variable 'ansible_distribution_major_version' from source: facts 12180 1727204077.73589: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204077.73595: variable 'omit' from source: magic vars 12180 1727204077.73631: variable 'omit' from source: magic vars 12180 1727204077.73708: variable 'profile' from source: include params 12180 1727204077.73712: variable 'item' from source: include params 12180 1727204077.73763: variable 'item' from source: include params 12180 1727204077.73779: variable 'omit' from source: magic vars 12180 1727204077.73813: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12180 1727204077.73843: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12180 1727204077.73862: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12180 1727204077.73877: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204077.73888: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204077.73911: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12180 
1727204077.73915: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204077.73918: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204077.73992: Set connection var ansible_pipelining to False 12180 1727204077.73995: Set connection var ansible_shell_type to sh 12180 1727204077.74000: Set connection var ansible_timeout to 10 12180 1727204077.74005: Set connection var ansible_connection to ssh 12180 1727204077.74011: Set connection var ansible_shell_executable to /bin/sh 12180 1727204077.74016: Set connection var ansible_module_compression to ZIP_DEFLATED 12180 1727204077.74040: variable 'ansible_shell_executable' from source: unknown 12180 1727204077.74043: variable 'ansible_connection' from source: unknown 12180 1727204077.74045: variable 'ansible_module_compression' from source: unknown 12180 1727204077.74049: variable 'ansible_shell_type' from source: unknown 12180 1727204077.74052: variable 'ansible_shell_executable' from source: unknown 12180 1727204077.74054: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204077.74057: variable 'ansible_pipelining' from source: unknown 12180 1727204077.74059: variable 'ansible_timeout' from source: unknown 12180 1727204077.74061: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204077.74165: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12180 1727204077.74176: variable 'omit' from source: magic vars 12180 1727204077.74179: starting attempt loop 12180 1727204077.74182: running the handler 12180 1727204077.74191: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12180 1727204077.74207: _low_level_execute_command(): starting 12180 1727204077.74214: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12180 1727204077.74758: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204077.74769: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204077.74798: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204077.74812: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204077.74873: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204077.74886: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204077.74951: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204077.76525: stdout chunk (state=3): >>>/root <<< 12180 1727204077.76626: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204077.76694: 
stderr chunk (state=3): >>><<< 12180 1727204077.76698: stdout chunk (state=3): >>><<< 12180 1727204077.76719: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204077.76733: _low_level_execute_command(): starting 12180 1727204077.76737: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204077.7671936-14849-22981494884230 `" && echo ansible-tmp-1727204077.7671936-14849-22981494884230="` echo /root/.ansible/tmp/ansible-tmp-1727204077.7671936-14849-22981494884230 `" ) && sleep 0' 12180 1727204077.77217: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204077.77231: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204077.77257: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204077.77273: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204077.77326: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204077.77349: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204077.77356: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204077.77408: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204077.79260: stdout chunk (state=3): >>>ansible-tmp-1727204077.7671936-14849-22981494884230=/root/.ansible/tmp/ansible-tmp-1727204077.7671936-14849-22981494884230 <<< 12180 1727204077.79379: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204077.79438: stderr chunk (state=3): >>><<< 12180 1727204077.79441: stdout chunk (state=3): >>><<< 12180 1727204077.79465: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204077.7671936-14849-22981494884230=/root/.ansible/tmp/ansible-tmp-1727204077.7671936-14849-22981494884230 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204077.79492: variable 'ansible_module_compression' from source: unknown 12180 1727204077.79542: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12180cbnqllfr/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12180 1727204077.79576: variable 'ansible_facts' from source: unknown 12180 1727204077.79641: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204077.7671936-14849-22981494884230/AnsiballZ_command.py 12180 1727204077.79761: Sending initial data 12180 1727204077.79768: Sent initial data (155 bytes) 12180 1727204077.80457: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204077.80460: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204077.80463: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204077.80488: stderr chunk (state=3): 
>>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 12180 1727204077.80492: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204077.80501: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204077.80507: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204077.80517: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204077.80522: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204077.80590: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204077.80597: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204077.80667: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204077.82378: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12180 1727204077.82428: stderr chunk (state=3): 
>>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12180 1727204077.82486: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12180cbnqllfr/tmp3tydia4u /root/.ansible/tmp/ansible-tmp-1727204077.7671936-14849-22981494884230/AnsiballZ_command.py <<< 12180 1727204077.82539: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12180 1727204077.83384: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204077.83495: stderr chunk (state=3): >>><<< 12180 1727204077.83499: stdout chunk (state=3): >>><<< 12180 1727204077.83516: done transferring module to remote 12180 1727204077.83525: _low_level_execute_command(): starting 12180 1727204077.83530: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204077.7671936-14849-22981494884230/ /root/.ansible/tmp/ansible-tmp-1727204077.7671936-14849-22981494884230/AnsiballZ_command.py && sleep 0' 12180 1727204077.83991: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204077.83998: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204077.84053: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 12180 1727204077.84057: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 12180 1727204077.84059: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204077.84067: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204077.84115: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204077.84121: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204077.84189: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204077.85912: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204077.86003: stderr chunk (state=3): >>><<< 12180 1727204077.86006: stdout chunk (state=3): >>><<< 12180 1727204077.86034: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: 
master session id: 2 debug2: Received exit status from master 0 12180 1727204077.86037: _low_level_execute_command(): starting 12180 1727204077.86040: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204077.7671936-14849-22981494884230/AnsiballZ_command.py && sleep 0' 12180 1727204077.86685: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204077.86691: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204077.86704: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204077.86723: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204077.86756: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204077.86769: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204077.86774: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204077.86798: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204077.86801: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204077.86804: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204077.86815: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204077.86836: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204077.86856: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204077.86859: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204077.86862: stderr chunk (state=3): >>>debug2: match found <<< 12180 
1727204077.86871: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204077.86936: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204077.86949: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204077.86952: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204077.87010: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204078.02461: stdout chunk (state=3): >>> {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "start": "2024-09-24 14:54:38.001170", "end": "2024-09-24 14:54:38.024153", "delta": "0:00:00.022983", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12180 1727204078.03875: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
<<< 12180 1727204078.03941: stderr chunk (state=3): >>><<< 12180 1727204078.03945: stdout chunk (state=3): >>><<< 12180 1727204078.04088: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "start": "2024-09-24 14:54:38.001170", "end": "2024-09-24 14:54:38.024153", "delta": "0:00:00.022983", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
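The remote command the module just ran is `nmcli -f NAME,FILENAME connection show | grep bond0.0 | grep /etc`, and its stdout (`bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection`) shows the two grep filters selecting the one profile whose keyfile lives under /etc. A minimal sketch of that same filtering in Python follows; the sample nmcli output is illustrative, not captured from a live system:

```python
# Hypothetical `nmcli -f NAME,FILENAME connection show` output, including a
# profile under /run that the second grep is meant to exclude.
SAMPLE_NMCLI_OUTPUT = """\
NAME      FILENAME
bond0.0   /etc/NetworkManager/system-connections/bond0.0.nmconnection
lo        /run/NetworkManager/system-connections/lo.nmconnection
"""

def filter_profiles(output: str, name: str = "bond0.0") -> list[str]:
    """Apply the same two substring filters as `grep bond0.0 | grep /etc`."""
    return [
        line for line in output.splitlines()
        if name in line and "/etc" in line
    ]
```

The task's rc of 0 corresponds to this list being non-empty; an empty list would mean grep exited non-zero and the profile keyfile was not found under /etc.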
12180 1727204078.04093: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204077.7671936-14849-22981494884230/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12180 1727204078.04101: _low_level_execute_command(): starting 12180 1727204078.04103: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204077.7671936-14849-22981494884230/ > /dev/null 2>&1 && sleep 0' 12180 1727204078.05938: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204078.06062: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204078.06082: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204078.06100: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204078.06197: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204078.06209: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204078.06224: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204078.06244: stderr chunk (state=3): >>>debug1: configuration requests final Match 
pass <<< 12180 1727204078.06272: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204078.06283: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204078.06295: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204078.06308: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204078.06322: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204078.06380: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204078.06392: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204078.06405: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204078.06487: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204078.06615: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204078.06629: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204078.06718: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204078.08608: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204078.08612: stdout chunk (state=3): >>><<< 12180 1727204078.08615: stderr chunk (state=3): >>><<< 12180 1727204078.08873: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204078.08877: handler run complete 12180 1727204078.08880: Evaluated conditional (False): False 12180 1727204078.08882: attempt loop complete, returning result 12180 1727204078.08884: _execute() done 12180 1727204078.08886: dumping result to json 12180 1727204078.08888: done dumping result, returning 12180 1727204078.08890: done running TaskExecutor() for managed-node1/TASK: Get NM profile info [0affcd87-79f5-ccb1-55ae-0000000003fe] 12180 1727204078.08892: sending task result for task 0affcd87-79f5-ccb1-55ae-0000000003fe 12180 1727204078.08967: done sending task result for task 0affcd87-79f5-ccb1-55ae-0000000003fe 12180 1727204078.08971: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "delta": "0:00:00.022983", "end": "2024-09-24 14:54:38.024153", "rc": 0, "start": "2024-09-24 14:54:38.001170" } STDOUT: bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection 12180 1727204078.09052: no more pending results, returning what we have 12180 1727204078.09056: results queue empty 12180 1727204078.09057: checking for any_errors_fatal 12180 1727204078.09066: done checking for any_errors_fatal 12180 1727204078.09067: checking for max_fail_percentage 
12180 1727204078.09070: done checking for max_fail_percentage 12180 1727204078.09070: checking to see if all hosts have failed and the running result is not ok 12180 1727204078.09072: done checking to see if all hosts have failed 12180 1727204078.09072: getting the remaining hosts for this loop 12180 1727204078.09074: done getting the remaining hosts for this loop 12180 1727204078.09077: getting the next task for host managed-node1 12180 1727204078.09085: done getting next task for host managed-node1 12180 1727204078.09087: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 12180 1727204078.09092: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12180 1727204078.09097: getting variables 12180 1727204078.09100: in VariableManager get_vars() 12180 1727204078.09145: Calling all_inventory to load vars for managed-node1 12180 1727204078.09148: Calling groups_inventory to load vars for managed-node1 12180 1727204078.09151: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204078.09165: Calling all_plugins_play to load vars for managed-node1 12180 1727204078.09168: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204078.09171: Calling groups_plugins_play to load vars for managed-node1 12180 1727204078.11420: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204078.15262: done with get_vars() 12180 1727204078.15633: done getting variables 12180 1727204078.15834: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Tuesday 24 September 2024 14:54:38 -0400 (0:00:00.433) 0:00:25.571 ***** 12180 1727204078.15962: entering _queue_task() for managed-node1/set_fact 12180 1727204078.16996: worker is 1 (out of 1 available) 12180 1727204078.17008: exiting _queue_task() for managed-node1/set_fact 12180 1727204078.17022: done queuing things up, now waiting for results queue to drain 12180 1727204078.17023: waiting for pending results... 
12180 1727204078.17673: running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 12180 1727204078.18017: in run() - task 0affcd87-79f5-ccb1-55ae-0000000003ff 12180 1727204078.18039: variable 'ansible_search_path' from source: unknown 12180 1727204078.18075: variable 'ansible_search_path' from source: unknown 12180 1727204078.18189: calling self._execute() 12180 1727204078.18411: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204078.18462: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204078.18482: variable 'omit' from source: magic vars 12180 1727204078.19374: variable 'ansible_distribution_major_version' from source: facts 12180 1727204078.19394: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204078.19744: variable 'nm_profile_exists' from source: set_fact 12180 1727204078.19771: Evaluated conditional (nm_profile_exists.rc == 0): True 12180 1727204078.19783: variable 'omit' from source: magic vars 12180 1727204078.19837: variable 'omit' from source: magic vars 12180 1727204078.19877: variable 'omit' from source: magic vars 12180 1727204078.19929: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12180 1727204078.19974: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12180 1727204078.20005: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12180 1727204078.20032: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204078.20048: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204078.20085: variable 'inventory_hostname' from source: host vars for 'managed-node1' 
12180 1727204078.20094: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204078.20106: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204078.20222: Set connection var ansible_pipelining to False 12180 1727204078.20232: Set connection var ansible_shell_type to sh 12180 1727204078.20247: Set connection var ansible_timeout to 10 12180 1727204078.20258: Set connection var ansible_connection to ssh 12180 1727204078.20269: Set connection var ansible_shell_executable to /bin/sh 12180 1727204078.20277: Set connection var ansible_module_compression to ZIP_DEFLATED 12180 1727204078.20305: variable 'ansible_shell_executable' from source: unknown 12180 1727204078.20310: variable 'ansible_connection' from source: unknown 12180 1727204078.20320: variable 'ansible_module_compression' from source: unknown 12180 1727204078.20325: variable 'ansible_shell_type' from source: unknown 12180 1727204078.20332: variable 'ansible_shell_executable' from source: unknown 12180 1727204078.20337: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204078.20347: variable 'ansible_pipelining' from source: unknown 12180 1727204078.20352: variable 'ansible_timeout' from source: unknown 12180 1727204078.20359: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204078.20507: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12180 1727204078.20524: variable 'omit' from source: magic vars 12180 1727204078.20540: starting attempt loop 12180 1727204078.20548: running the handler 12180 1727204078.20570: handler run complete 12180 1727204078.20584: attempt loop complete, returning result 12180 1727204078.20589: _execute() done 
12180 1727204078.20594: dumping result to json 12180 1727204078.20600: done dumping result, returning 12180 1727204078.20609: done running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0affcd87-79f5-ccb1-55ae-0000000003ff] 12180 1727204078.20617: sending task result for task 0affcd87-79f5-ccb1-55ae-0000000003ff ok: [managed-node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 12180 1727204078.20780: no more pending results, returning what we have 12180 1727204078.20784: results queue empty 12180 1727204078.20785: checking for any_errors_fatal 12180 1727204078.20795: done checking for any_errors_fatal 12180 1727204078.20796: checking for max_fail_percentage 12180 1727204078.20798: done checking for max_fail_percentage 12180 1727204078.20799: checking to see if all hosts have failed and the running result is not ok 12180 1727204078.20800: done checking to see if all hosts have failed 12180 1727204078.20801: getting the remaining hosts for this loop 12180 1727204078.20802: done getting the remaining hosts for this loop 12180 1727204078.20807: getting the next task for host managed-node1 12180 1727204078.20820: done getting next task for host managed-node1 12180 1727204078.20824: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 12180 1727204078.20829: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12180 1727204078.20833: getting variables 12180 1727204078.20835: in VariableManager get_vars() 12180 1727204078.20885: Calling all_inventory to load vars for managed-node1 12180 1727204078.20888: Calling groups_inventory to load vars for managed-node1 12180 1727204078.20891: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204078.20903: Calling all_plugins_play to load vars for managed-node1 12180 1727204078.20906: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204078.20909: Calling groups_plugins_play to load vars for managed-node1 12180 1727204078.21952: done sending task result for task 0affcd87-79f5-ccb1-55ae-0000000003ff 12180 1727204078.21956: WORKER PROCESS EXITING 12180 1727204078.23658: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204078.27428: done with get_vars() 12180 1727204078.27466: done getting variables 12180 1727204078.27533: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 12180 1727204078.27865: variable 'profile' from source: include params 12180 1727204078.27869: variable 'item' from source: include params 12180 1727204078.27932: variable 'item' from source: include params TASK [Get the 
ansible_managed comment in ifcfg-bond0.0] ************************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49
Tuesday 24 September 2024 14:54:38 -0400 (0:00:00.120) 0:00:25.691 *****
12180 1727204078.28197: entering _queue_task() for managed-node1/command
12180 1727204078.28716: worker is 1 (out of 1 available)
12180 1727204078.28728: exiting _queue_task() for managed-node1/command
12180 1727204078.28740: done queuing things up, now waiting for results queue to drain
12180 1727204078.28741: waiting for pending results...
12180 1727204078.29026: running TaskExecutor() for managed-node1/TASK: Get the ansible_managed comment in ifcfg-bond0.0
12180 1727204078.29160: in run() - task 0affcd87-79f5-ccb1-55ae-000000000401
12180 1727204078.29188: variable 'ansible_search_path' from source: unknown
12180 1727204078.29196: variable 'ansible_search_path' from source: unknown
12180 1727204078.29239: calling self._execute()
12180 1727204078.29348: variable 'ansible_host' from source: host vars for 'managed-node1'
12180 1727204078.29359: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12180 1727204078.29378: variable 'omit' from source: magic vars
12180 1727204078.29754: variable 'ansible_distribution_major_version' from source: facts
12180 1727204078.29777: Evaluated conditional (ansible_distribution_major_version != '6'): True
12180 1727204078.29916: variable 'profile_stat' from source: set_fact
12180 1727204078.29941: Evaluated conditional (profile_stat.stat.exists): False
12180 1727204078.29949: when evaluation is False, skipping this task
12180 1727204078.29956: _execute() done
12180 1727204078.29963: dumping result to json
12180 1727204078.29974: done dumping result, returning
12180 1727204078.29984: done running TaskExecutor() for managed-node1/TASK: Get the ansible_managed comment in ifcfg-bond0.0 [0affcd87-79f5-ccb1-55ae-000000000401]
12180 1727204078.29995: sending task result for task 0affcd87-79f5-ccb1-55ae-000000000401
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
12180 1727204078.30159: no more pending results, returning what we have
12180 1727204078.30167: results queue empty
12180 1727204078.30171: checking for any_errors_fatal
12180 1727204078.30181: done checking for any_errors_fatal
12180 1727204078.30182: checking for max_fail_percentage
12180 1727204078.30185: done checking for max_fail_percentage
12180 1727204078.30185: checking to see if all hosts have failed and the running result is not ok
12180 1727204078.30187: done checking to see if all hosts have failed
12180 1727204078.30187: getting the remaining hosts for this loop
12180 1727204078.30189: done getting the remaining hosts for this loop
12180 1727204078.30195: getting the next task for host managed-node1
12180 1727204078.30204: done getting next task for host managed-node1
12180 1727204078.30207: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }}
12180 1727204078.30211: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12180 1727204078.30215: getting variables
12180 1727204078.30217: in VariableManager get_vars()
12180 1727204078.30263: Calling all_inventory to load vars for managed-node1
12180 1727204078.30267: Calling groups_inventory to load vars for managed-node1
12180 1727204078.30270: Calling all_plugins_inventory to load vars for managed-node1
12180 1727204078.30296: Calling all_plugins_play to load vars for managed-node1
12180 1727204078.30299: Calling groups_plugins_inventory to load vars for managed-node1
12180 1727204078.30302: Calling groups_plugins_play to load vars for managed-node1
12180 1727204078.31287: done sending task result for task 0affcd87-79f5-ccb1-55ae-000000000401
12180 1727204078.31291: WORKER PROCESS EXITING
12180 1727204078.32825: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12180 1727204078.34577: done with get_vars()
12180 1727204078.34606: done getting variables
12180 1727204078.34668: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
12180 1727204078.34811: variable 'profile' from source: include params
12180 1727204078.34818: variable 'item' from source: include params
12180 1727204078.34994: variable 'item' from source: include params

TASK [Verify the ansible_managed comment in ifcfg-bond0.0] *********************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56
Tuesday 24 September 2024 14:54:38 -0400 (0:00:00.070) 0:00:25.762 *****
12180 1727204078.35035: entering _queue_task() for managed-node1/set_fact
12180 1727204078.35379: worker is 1 (out of 1 available)
12180 1727204078.35390: exiting _queue_task() for managed-node1/set_fact
12180 1727204078.35403: done queuing things up, now waiting for results queue to drain
12180 1727204078.35405: waiting for pending results...
12180 1727204078.35690: running TaskExecutor() for managed-node1/TASK: Verify the ansible_managed comment in ifcfg-bond0.0
12180 1727204078.35827: in run() - task 0affcd87-79f5-ccb1-55ae-000000000402
12180 1727204078.35851: variable 'ansible_search_path' from source: unknown
12180 1727204078.35859: variable 'ansible_search_path' from source: unknown
12180 1727204078.35899: calling self._execute()
12180 1727204078.36002: variable 'ansible_host' from source: host vars for 'managed-node1'
12180 1727204078.36016: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12180 1727204078.36032: variable 'omit' from source: magic vars
12180 1727204078.36420: variable 'ansible_distribution_major_version' from source: facts
12180 1727204078.36439: Evaluated conditional (ansible_distribution_major_version != '6'): True
12180 1727204078.36573: variable 'profile_stat' from source: set_fact
12180 1727204078.36595: Evaluated conditional (profile_stat.stat.exists): False
12180 1727204078.36607: when evaluation is False, skipping this task
12180 1727204078.36616: _execute() done
12180 1727204078.36623: dumping result to json
12180 1727204078.36630: done dumping result, returning
12180 1727204078.36641: done running TaskExecutor() for managed-node1/TASK: Verify the ansible_managed comment in ifcfg-bond0.0 [0affcd87-79f5-ccb1-55ae-000000000402]
12180 1727204078.36651: sending task result for task 0affcd87-79f5-ccb1-55ae-000000000402
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
12180 1727204078.36801: no more pending results, returning what we have
12180 1727204078.36806: results queue empty
12180 1727204078.36808: checking for any_errors_fatal
12180 1727204078.36815: done checking for any_errors_fatal
12180 1727204078.36816: checking for max_fail_percentage
12180 1727204078.36818: done checking for max_fail_percentage
12180 1727204078.36819: checking to see if all hosts have failed and the running result is not ok
12180 1727204078.36820: done checking to see if all hosts have failed
12180 1727204078.36821: getting the remaining hosts for this loop
12180 1727204078.36822: done getting the remaining hosts for this loop
12180 1727204078.36826: getting the next task for host managed-node1
12180 1727204078.36835: done getting next task for host managed-node1
12180 1727204078.36837: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }}
12180 1727204078.36842: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12180 1727204078.36848: getting variables
12180 1727204078.36850: in VariableManager get_vars()
12180 1727204078.36899: Calling all_inventory to load vars for managed-node1
12180 1727204078.36902: Calling groups_inventory to load vars for managed-node1
12180 1727204078.36905: Calling all_plugins_inventory to load vars for managed-node1
12180 1727204078.36920: Calling all_plugins_play to load vars for managed-node1
12180 1727204078.36923: Calling groups_plugins_inventory to load vars for managed-node1
12180 1727204078.36927: Calling groups_plugins_play to load vars for managed-node1
12180 1727204078.37986: done sending task result for task 0affcd87-79f5-ccb1-55ae-000000000402
12180 1727204078.37990: WORKER PROCESS EXITING
12180 1727204078.38770: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12180 1727204078.40393: done with get_vars()
12180 1727204078.40419: done getting variables
12180 1727204078.40484: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
12180 1727204078.40601: variable 'profile' from source: include params
12180 1727204078.40605: variable 'item' from source: include params
12180 1727204078.40667: variable 'item' from source: include params

TASK [Get the fingerprint comment in ifcfg-bond0.0] ****************************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62
Tuesday 24 September 2024 14:54:38 -0400 (0:00:00.056) 0:00:25.818 *****
12180 1727204078.40699: entering _queue_task() for managed-node1/command
12180 1727204078.41012: worker is 1 (out of 1 available)
12180 1727204078.41024: exiting _queue_task() for managed-node1/command
12180 1727204078.41036: done queuing things up, now waiting for results queue to drain
12180 1727204078.41038: waiting for pending results...
12180 1727204078.41323: running TaskExecutor() for managed-node1/TASK: Get the fingerprint comment in ifcfg-bond0.0
12180 1727204078.41454: in run() - task 0affcd87-79f5-ccb1-55ae-000000000403
12180 1727204078.41478: variable 'ansible_search_path' from source: unknown
12180 1727204078.41488: variable 'ansible_search_path' from source: unknown
12180 1727204078.41529: calling self._execute()
12180 1727204078.41630: variable 'ansible_host' from source: host vars for 'managed-node1'
12180 1727204078.41642: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12180 1727204078.41655: variable 'omit' from source: magic vars
12180 1727204078.42107: variable 'ansible_distribution_major_version' from source: facts
12180 1727204078.42126: Evaluated conditional (ansible_distribution_major_version != '6'): True
12180 1727204078.42255: variable 'profile_stat' from source: set_fact
12180 1727204078.42285: Evaluated conditional (profile_stat.stat.exists): False
12180 1727204078.42376: when evaluation is False, skipping this task
12180 1727204078.42383: _execute() done
12180 1727204078.42390: dumping result to json
12180 1727204078.42397: done dumping result, returning
12180 1727204078.42408: done running TaskExecutor() for managed-node1/TASK: Get the fingerprint comment in ifcfg-bond0.0 [0affcd87-79f5-ccb1-55ae-000000000403]
12180 1727204078.42419: sending task result for task 0affcd87-79f5-ccb1-55ae-000000000403
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
12180 1727204078.42577: no more pending results, returning what we have
12180 1727204078.42582: results queue empty
12180 1727204078.42583: checking for any_errors_fatal
12180 1727204078.42591: done checking for any_errors_fatal
12180 1727204078.42592: checking for max_fail_percentage
12180 1727204078.42594: done checking for max_fail_percentage
12180 1727204078.42595: checking to see if all hosts have failed and the running result is not ok
12180 1727204078.42596: done checking to see if all hosts have failed
12180 1727204078.42597: getting the remaining hosts for this loop
12180 1727204078.42598: done getting the remaining hosts for this loop
12180 1727204078.42602: getting the next task for host managed-node1
12180 1727204078.42610: done getting next task for host managed-node1
12180 1727204078.42613: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }}
12180 1727204078.42618: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12180 1727204078.42623: getting variables
12180 1727204078.42625: in VariableManager get_vars()
12180 1727204078.42672: Calling all_inventory to load vars for managed-node1
12180 1727204078.42675: Calling groups_inventory to load vars for managed-node1
12180 1727204078.42678: Calling all_plugins_inventory to load vars for managed-node1
12180 1727204078.42691: Calling all_plugins_play to load vars for managed-node1
12180 1727204078.42694: Calling groups_plugins_inventory to load vars for managed-node1
12180 1727204078.42697: Calling groups_plugins_play to load vars for managed-node1
12180 1727204078.44173: done sending task result for task 0affcd87-79f5-ccb1-55ae-000000000403
12180 1727204078.44178: WORKER PROCESS EXITING
12180 1727204078.45108: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12180 1727204078.47782: done with get_vars()
12180 1727204078.47808: done getting variables
12180 1727204078.47872: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
12180 1727204078.47991: variable 'profile' from source: include params
12180 1727204078.47995: variable 'item' from source: include params
12180 1727204078.48055: variable 'item' from source: include params

TASK [Verify the fingerprint comment in ifcfg-bond0.0] *************************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69
Tuesday 24 September 2024 14:54:38 -0400 (0:00:00.073) 0:00:25.892 *****
12180 1727204078.48089: entering _queue_task() for managed-node1/set_fact
12180 1727204078.48403: worker is 1 (out of 1 available)
12180 1727204078.48413: exiting _queue_task() for managed-node1/set_fact
12180 1727204078.48425: done queuing things up, now waiting for results queue to drain
12180 1727204078.48427: waiting for pending results...
12180 1727204078.48709: running TaskExecutor() for managed-node1/TASK: Verify the fingerprint comment in ifcfg-bond0.0
12180 1727204078.48850: in run() - task 0affcd87-79f5-ccb1-55ae-000000000404
12180 1727204078.48875: variable 'ansible_search_path' from source: unknown
12180 1727204078.48883: variable 'ansible_search_path' from source: unknown
12180 1727204078.48923: calling self._execute()
12180 1727204078.49022: variable 'ansible_host' from source: host vars for 'managed-node1'
12180 1727204078.49034: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12180 1727204078.49047: variable 'omit' from source: magic vars
12180 1727204078.49403: variable 'ansible_distribution_major_version' from source: facts
12180 1727204078.49484: Evaluated conditional (ansible_distribution_major_version != '6'): True
12180 1727204078.49758: variable 'profile_stat' from source: set_fact
12180 1727204078.49781: Evaluated conditional (profile_stat.stat.exists): False
12180 1727204078.49854: when evaluation is False, skipping this task
12180 1727204078.49862: _execute() done
12180 1727204078.49872: dumping result to json
12180 1727204078.49879: done dumping result, returning
12180 1727204078.49887: done running TaskExecutor() for managed-node1/TASK: Verify the fingerprint comment in ifcfg-bond0.0 [0affcd87-79f5-ccb1-55ae-000000000404]
12180 1727204078.49896: sending task result for task 0affcd87-79f5-ccb1-55ae-000000000404
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
12180 1727204078.50045: no more pending results, returning what we have
12180 1727204078.50049: results queue empty
12180 1727204078.50050: checking for any_errors_fatal
12180 1727204078.50057: done checking for any_errors_fatal
12180 1727204078.50058: checking for max_fail_percentage
12180 1727204078.50060: done checking for max_fail_percentage
12180 1727204078.50060: checking to see if all hosts have failed and the running result is not ok
12180 1727204078.50062: done checking to see if all hosts have failed
12180 1727204078.50063: getting the remaining hosts for this loop
12180 1727204078.50066: done getting the remaining hosts for this loop
12180 1727204078.50070: getting the next task for host managed-node1
12180 1727204078.50080: done getting next task for host managed-node1
12180 1727204078.50083: ^ task is: TASK: Assert that the profile is present - '{{ profile }}'
12180 1727204078.50086: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12180 1727204078.50090: getting variables
12180 1727204078.50092: in VariableManager get_vars()
12180 1727204078.50135: Calling all_inventory to load vars for managed-node1
12180 1727204078.50138: Calling groups_inventory to load vars for managed-node1
12180 1727204078.50141: Calling all_plugins_inventory to load vars for managed-node1
12180 1727204078.50154: Calling all_plugins_play to load vars for managed-node1
12180 1727204078.50157: Calling groups_plugins_inventory to load vars for managed-node1
12180 1727204078.50160: Calling groups_plugins_play to load vars for managed-node1
12180 1727204078.51384: done sending task result for task 0affcd87-79f5-ccb1-55ae-000000000404
12180 1727204078.51388: WORKER PROCESS EXITING
12180 1727204078.52355: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12180 1727204078.54157: done with get_vars()
12180 1727204078.54187: done getting variables
12180 1727204078.54249: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
12180 1727204078.54371: variable 'profile' from source: include params
12180 1727204078.54375: variable 'item' from source: include params
12180 1727204078.54436: variable 'item' from source: include params

TASK [Assert that the profile is present - 'bond0.0'] **************************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5
Tuesday 24 September 2024 14:54:38 -0400 (0:00:00.063) 0:00:25.956 *****
12180 1727204078.54469: entering _queue_task() for managed-node1/assert
12180 1727204078.54785: worker is 1 (out of 1 available)
12180 1727204078.54798: exiting _queue_task() for managed-node1/assert
12180 1727204078.54812: done queuing things up, now waiting for results queue to drain
12180 1727204078.54814: waiting for pending results...
12180 1727204078.55100: running TaskExecutor() for managed-node1/TASK: Assert that the profile is present - 'bond0.0'
12180 1727204078.55225: in run() - task 0affcd87-79f5-ccb1-55ae-000000000268
12180 1727204078.55246: variable 'ansible_search_path' from source: unknown
12180 1727204078.55259: variable 'ansible_search_path' from source: unknown
12180 1727204078.55300: calling self._execute()
12180 1727204078.55401: variable 'ansible_host' from source: host vars for 'managed-node1'
12180 1727204078.55413: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12180 1727204078.55428: variable 'omit' from source: magic vars
12180 1727204078.55788: variable 'ansible_distribution_major_version' from source: facts
12180 1727204078.55812: Evaluated conditional (ansible_distribution_major_version != '6'): True
12180 1727204078.55824: variable 'omit' from source: magic vars
12180 1727204078.55870: variable 'omit' from source: magic vars
12180 1727204078.55978: variable 'profile' from source: include params
12180 1727204078.55988: variable 'item' from source: include params
12180 1727204078.56059: variable 'item' from source: include params
12180 1727204078.56085: variable 'omit' from source: magic vars
12180 1727204078.56136: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
12180 1727204078.56177: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
12180 1727204078.56257: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
12180 1727204078.56282: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
12180 1727204078.56297: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
12180 1727204078.56374: variable 'inventory_hostname' from source: host vars for 'managed-node1'
12180 1727204078.56457: variable 'ansible_host' from source: host vars for 'managed-node1'
12180 1727204078.56468: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12180 1727204078.56688: Set connection var ansible_pipelining to False
12180 1727204078.56696: Set connection var ansible_shell_type to sh
12180 1727204078.56706: Set connection var ansible_timeout to 10
12180 1727204078.56714: Set connection var ansible_connection to ssh
12180 1727204078.56722: Set connection var ansible_shell_executable to /bin/sh
12180 1727204078.56730: Set connection var ansible_module_compression to ZIP_DEFLATED
12180 1727204078.56759: variable 'ansible_shell_executable' from source: unknown
12180 1727204078.56883: variable 'ansible_connection' from source: unknown
12180 1727204078.56892: variable 'ansible_module_compression' from source: unknown
12180 1727204078.56899: variable 'ansible_shell_type' from source: unknown
12180 1727204078.56907: variable 'ansible_shell_executable' from source: unknown
12180 1727204078.56913: variable 'ansible_host' from source: host vars for 'managed-node1'
12180 1727204078.56921: variable 'ansible_pipelining' from source: unknown
12180 1727204078.56927: variable 'ansible_timeout' from source: unknown
12180 1727204078.56934: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12180 1727204078.57076: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
12180 1727204078.57222: variable 'omit' from source: magic vars
12180 1727204078.57234: starting attempt loop
12180 1727204078.57242: running the handler
12180 1727204078.57480: variable 'lsr_net_profile_exists' from source: set_fact
12180 1727204078.57491: Evaluated conditional (lsr_net_profile_exists): True
12180 1727204078.57503: handler run complete
12180 1727204078.57523: attempt loop complete, returning result
12180 1727204078.57647: _execute() done
12180 1727204078.57655: dumping result to json
12180 1727204078.57667: done dumping result, returning
12180 1727204078.57679: done running TaskExecutor() for managed-node1/TASK: Assert that the profile is present - 'bond0.0' [0affcd87-79f5-ccb1-55ae-000000000268]
12180 1727204078.57690: sending task result for task 0affcd87-79f5-ccb1-55ae-000000000268
ok: [managed-node1] => {
    "changed": false
}

MSG:

All assertions passed
12180 1727204078.57843: no more pending results, returning what we have
12180 1727204078.57847: results queue empty
12180 1727204078.57848: checking for any_errors_fatal
12180 1727204078.57855: done checking for any_errors_fatal
12180 1727204078.57856: checking for max_fail_percentage
12180 1727204078.57858: done checking for max_fail_percentage
12180 1727204078.57859: checking to see if all hosts have failed and the running result is not ok
12180 1727204078.57860: done checking to see if all hosts have failed
12180 1727204078.57861: getting the remaining hosts for this loop
12180 1727204078.57862: done getting the remaining hosts for this loop
12180 1727204078.57868: getting the next task for host managed-node1
12180 1727204078.57875: done getting next task for host managed-node1
12180 1727204078.57878: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}'
12180 1727204078.57882: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12180 1727204078.57887: getting variables
12180 1727204078.57889: in VariableManager get_vars()
12180 1727204078.57935: Calling all_inventory to load vars for managed-node1
12180 1727204078.57938: Calling groups_inventory to load vars for managed-node1
12180 1727204078.57941: Calling all_plugins_inventory to load vars for managed-node1
12180 1727204078.57953: Calling all_plugins_play to load vars for managed-node1
12180 1727204078.57955: Calling groups_plugins_inventory to load vars for managed-node1
12180 1727204078.57958: Calling groups_plugins_play to load vars for managed-node1
12180 1727204078.58984: done sending task result for task 0affcd87-79f5-ccb1-55ae-000000000268
12180 1727204078.58988: WORKER PROCESS EXITING
12180 1727204078.60809: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12180 1727204078.64548: done with get_vars()
12180 1727204078.64584: done getting variables
12180 1727204078.64681: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
12180 1727204078.64824: variable 'profile' from source: include params
12180 1727204078.64828: variable 'item' from source: include params
12180 1727204078.64915: variable 'item' from source: include params

TASK [Assert that the ansible managed comment is present in 'bond0.0'] *********
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10
Tuesday 24 September 2024 14:54:38 -0400 (0:00:00.104) 0:00:26.061 *****
12180 1727204078.64954: entering _queue_task() for managed-node1/assert
12180 1727204078.65384: worker is 1 (out of 1 available)
12180 1727204078.65407: exiting _queue_task() for managed-node1/assert
12180 1727204078.65419: done queuing things up, now waiting for results queue to drain
12180 1727204078.65421: waiting for pending results...
12180 1727204078.65789: running TaskExecutor() for managed-node1/TASK: Assert that the ansible managed comment is present in 'bond0.0'
12180 1727204078.65917: in run() - task 0affcd87-79f5-ccb1-55ae-000000000269
12180 1727204078.65931: variable 'ansible_search_path' from source: unknown
12180 1727204078.65935: variable 'ansible_search_path' from source: unknown
12180 1727204078.65985: calling self._execute()
12180 1727204078.67153: variable 'ansible_host' from source: host vars for 'managed-node1'
12180 1727204078.67158: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12180 1727204078.67352: variable 'omit' from source: magic vars
12180 1727204078.68157: variable 'ansible_distribution_major_version' from source: facts
12180 1727204078.68180: Evaluated conditional (ansible_distribution_major_version != '6'): True
12180 1727204078.68203: variable 'omit' from source: magic vars
12180 1727204078.68271: variable 'omit' from source: magic vars
12180 1727204078.68379: variable 'profile' from source: include params
12180 1727204078.68388: variable 'item' from source: include params
12180 1727204078.68452: variable 'item' from source: include params
12180 1727204078.68477: variable 'omit' from source: magic vars
12180 1727204078.68522: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
12180 1727204078.68566: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
12180 1727204078.68593: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
12180 1727204078.69182: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
12180 1727204078.69199: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
12180 1727204078.69235: variable 'inventory_hostname' from source: host vars for 'managed-node1'
12180 1727204078.69243: variable 'ansible_host' from source: host vars for 'managed-node1'
12180 1727204078.69251: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12180 1727204078.69356: Set connection var ansible_pipelining to False
12180 1727204078.69366: Set connection var ansible_shell_type to sh
12180 1727204078.69377: Set connection var ansible_timeout to 10
12180 1727204078.69385: Set connection var ansible_connection to ssh
12180 1727204078.69393: Set connection var ansible_shell_executable to /bin/sh
12180 1727204078.69401: Set connection var ansible_module_compression to ZIP_DEFLATED
12180 1727204078.69435: variable 'ansible_shell_executable' from source: unknown
12180 1727204078.69444: variable 'ansible_connection' from source: unknown
12180 1727204078.69450: variable 'ansible_module_compression' from source: unknown
12180 1727204078.69456: variable 'ansible_shell_type' from source: unknown
12180 1727204078.69462: variable 'ansible_shell_executable' from source: unknown
12180 1727204078.69470: variable 'ansible_host' from source: host vars for 'managed-node1'
12180 1727204078.69477: variable 'ansible_pipelining' from source: unknown
12180 1727204078.69483: variable 'ansible_timeout' from source: unknown
12180 1727204078.69490: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12180 1727204078.69638: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
12180 1727204078.69657: variable 'omit' from source: magic vars
12180 1727204078.69673: starting attempt loop
12180 1727204078.69682: running the handler
12180 1727204078.69816: variable 'lsr_net_profile_ansible_managed' from source: set_fact
12180 1727204078.69885: Evaluated conditional (lsr_net_profile_ansible_managed): True
12180 1727204078.69895: handler run complete
12180 1727204078.69913: attempt loop complete, returning result
12180 1727204078.69945: _execute() done
12180 1727204078.69952: dumping result to json
12180 1727204078.69962: done dumping result, returning
12180 1727204078.69993: done running TaskExecutor() for managed-node1/TASK: Assert that the ansible managed comment is present in 'bond0.0' [0affcd87-79f5-ccb1-55ae-000000000269]
12180 1727204078.70073: sending task result for task 0affcd87-79f5-ccb1-55ae-000000000269
ok: [managed-node1] => {
    "changed": false
}

MSG:

All assertions passed
12180 1727204078.70228: no more pending results, returning what we have
12180 1727204078.70234: results queue empty
12180 1727204078.70235: checking for any_errors_fatal
12180 1727204078.70242: done checking for any_errors_fatal
12180 1727204078.70243: checking for max_fail_percentage
12180 1727204078.70245: done checking for max_fail_percentage
12180 1727204078.70245: checking to see if all hosts have failed and the running result is not ok
12180 1727204078.70246: done checking to see if all hosts have failed
12180 1727204078.70247: getting the remaining hosts for this loop
12180 1727204078.70248: done getting the remaining hosts for this loop
12180 1727204078.70252: getting the next task for host managed-node1
12180 1727204078.70258: done getting next task for host managed-node1
12180 1727204078.70261: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }}
12180 1727204078.70267: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12180 1727204078.70270: getting variables
12180 1727204078.70272: in VariableManager get_vars()
12180 1727204078.70315: Calling all_inventory to load vars for managed-node1
12180 1727204078.70318: Calling groups_inventory to load vars for managed-node1
12180 1727204078.70320: Calling all_plugins_inventory to load vars for managed-node1
12180 1727204078.70373: Calling all_plugins_play to load vars for managed-node1
12180 1727204078.70377: Calling groups_plugins_inventory to load vars for managed-node1
12180 1727204078.70405: Calling groups_plugins_play to load vars for managed-node1
12180 1727204078.70929: done sending task result for task 0affcd87-79f5-ccb1-55ae-000000000269
12180 1727204078.70933: WORKER PROCESS EXITING
12180 1727204078.71828: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12180 1727204078.73713: done with get_vars()
12180 1727204078.73751: done getting variables
12180 1727204078.73856: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
12180 1727204078.74019: variable 'profile' from source: include params
12180 1727204078.74023: variable 'item' from source: include params
12180 1727204078.74119: variable 'item' from source: include params

TASK [Assert that the fingerprint comment is present in bond0.0] ***************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15
Tuesday 24 September 2024 14:54:38 -0400 (0:00:00.092) 0:00:26.153 *****
12180 1727204078.74172: entering _queue_task() for managed-node1/assert
12180 1727204078.74603: worker is 1 (out of 1 available)
12180 1727204078.74615: exiting _queue_task() for managed-node1/assert
12180 1727204078.74627: done queuing things up, now waiting for results queue to drain
12180 1727204078.74632: waiting for pending results...
12180 1727204078.74965: running TaskExecutor() for managed-node1/TASK: Assert that the fingerprint comment is present in bond0.0
12180 1727204078.75073: in run() - task 0affcd87-79f5-ccb1-55ae-00000000026a
12180 1727204078.75091: variable 'ansible_search_path' from source: unknown
12180 1727204078.75095: variable 'ansible_search_path' from source: unknown
12180 1727204078.75146: calling self._execute()
12180 1727204078.75255: variable 'ansible_host' from source: host vars for 'managed-node1'
12180 1727204078.75258: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12180 1727204078.75271: variable 'omit' from source: magic vars
12180 1727204078.75695: variable 'ansible_distribution_major_version' from source: facts
12180 1727204078.75708: Evaluated conditional (ansible_distribution_major_version != '6'): True
12180 1727204078.75714: variable 'omit' from source: magic vars
12180 1727204078.75761: variable 'omit' from source: magic vars
12180 1727204078.75877: variable 'profile' from source: include params
12180 1727204078.75881: variable 'item' from source: include params
12180
1727204078.75953: variable 'item' from source: include params 12180 1727204078.75978: variable 'omit' from source: magic vars 12180 1727204078.76027: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12180 1727204078.76062: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12180 1727204078.76086: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12180 1727204078.76102: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204078.76118: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204078.76149: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12180 1727204078.76152: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204078.76154: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204078.76260: Set connection var ansible_pipelining to False 12180 1727204078.76265: Set connection var ansible_shell_type to sh 12180 1727204078.76270: Set connection var ansible_timeout to 10 12180 1727204078.76276: Set connection var ansible_connection to ssh 12180 1727204078.76281: Set connection var ansible_shell_executable to /bin/sh 12180 1727204078.76286: Set connection var ansible_module_compression to ZIP_DEFLATED 12180 1727204078.76316: variable 'ansible_shell_executable' from source: unknown 12180 1727204078.76326: variable 'ansible_connection' from source: unknown 12180 1727204078.76328: variable 'ansible_module_compression' from source: unknown 12180 1727204078.76334: variable 'ansible_shell_type' from source: unknown 12180 1727204078.76337: variable 'ansible_shell_executable' from source: unknown 12180 1727204078.76339: variable 'ansible_host' from source: host 
vars for 'managed-node1' 12180 1727204078.76343: variable 'ansible_pipelining' from source: unknown 12180 1727204078.76346: variable 'ansible_timeout' from source: unknown 12180 1727204078.76349: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204078.76491: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12180 1727204078.76502: variable 'omit' from source: magic vars 12180 1727204078.76511: starting attempt loop 12180 1727204078.76514: running the handler 12180 1727204078.76637: variable 'lsr_net_profile_fingerprint' from source: set_fact 12180 1727204078.76641: Evaluated conditional (lsr_net_profile_fingerprint): True 12180 1727204078.76655: handler run complete 12180 1727204078.76670: attempt loop complete, returning result 12180 1727204078.76673: _execute() done 12180 1727204078.76676: dumping result to json 12180 1727204078.76678: done dumping result, returning 12180 1727204078.76686: done running TaskExecutor() for managed-node1/TASK: Assert that the fingerprint comment is present in bond0.0 [0affcd87-79f5-ccb1-55ae-00000000026a] 12180 1727204078.76691: sending task result for task 0affcd87-79f5-ccb1-55ae-00000000026a ok: [managed-node1] => { "changed": false } MSG: All assertions passed 12180 1727204078.76825: no more pending results, returning what we have 12180 1727204078.76832: results queue empty 12180 1727204078.76833: checking for any_errors_fatal 12180 1727204078.76839: done checking for any_errors_fatal 12180 1727204078.76840: checking for max_fail_percentage 12180 1727204078.76842: done checking for max_fail_percentage 12180 1727204078.76843: checking to see if all hosts have failed and the running result is not ok 12180 1727204078.76844: done checking to see 
if all hosts have failed 12180 1727204078.76845: getting the remaining hosts for this loop 12180 1727204078.76846: done getting the remaining hosts for this loop 12180 1727204078.76850: getting the next task for host managed-node1 12180 1727204078.76861: done getting next task for host managed-node1 12180 1727204078.76865: ^ task is: TASK: Include the task 'get_profile_stat.yml' 12180 1727204078.76868: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12180 1727204078.76875: getting variables 12180 1727204078.76876: in VariableManager get_vars() 12180 1727204078.76921: Calling all_inventory to load vars for managed-node1 12180 1727204078.76924: Calling groups_inventory to load vars for managed-node1 12180 1727204078.76927: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204078.76942: Calling all_plugins_play to load vars for managed-node1 12180 1727204078.76945: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204078.76948: Calling groups_plugins_play to load vars for managed-node1 12180 1727204078.77467: done sending task result for task 0affcd87-79f5-ccb1-55ae-00000000026a 12180 1727204078.77470: WORKER PROCESS EXITING 12180 1727204078.79156: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204078.80947: done with get_vars() 12180 1727204078.80976: done getting variables TASK [Include the task 'get_profile_stat.yml'] 
********************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Tuesday 24 September 2024 14:54:38 -0400 (0:00:00.069) 0:00:26.222 ***** 12180 1727204078.81080: entering _queue_task() for managed-node1/include_tasks 12180 1727204078.81411: worker is 1 (out of 1 available) 12180 1727204078.81423: exiting _queue_task() for managed-node1/include_tasks 12180 1727204078.81440: done queuing things up, now waiting for results queue to drain 12180 1727204078.81442: waiting for pending results... 12180 1727204078.81734: running TaskExecutor() for managed-node1/TASK: Include the task 'get_profile_stat.yml' 12180 1727204078.81862: in run() - task 0affcd87-79f5-ccb1-55ae-00000000026e 12180 1727204078.81874: variable 'ansible_search_path' from source: unknown 12180 1727204078.81878: variable 'ansible_search_path' from source: unknown 12180 1727204078.81929: calling self._execute() 12180 1727204078.82041: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204078.82045: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204078.82054: variable 'omit' from source: magic vars 12180 1727204078.82474: variable 'ansible_distribution_major_version' from source: facts 12180 1727204078.82494: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204078.82498: _execute() done 12180 1727204078.82501: dumping result to json 12180 1727204078.82504: done dumping result, returning 12180 1727204078.82506: done running TaskExecutor() for managed-node1/TASK: Include the task 'get_profile_stat.yml' [0affcd87-79f5-ccb1-55ae-00000000026e] 12180 1727204078.82509: sending task result for task 0affcd87-79f5-ccb1-55ae-00000000026e 12180 1727204078.82611: done sending task result for task 0affcd87-79f5-ccb1-55ae-00000000026e 12180 1727204078.82614: WORKER PROCESS EXITING 12180 1727204078.82672: no more pending 
results, returning what we have 12180 1727204078.82680: in VariableManager get_vars() 12180 1727204078.82735: Calling all_inventory to load vars for managed-node1 12180 1727204078.82738: Calling groups_inventory to load vars for managed-node1 12180 1727204078.82741: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204078.82756: Calling all_plugins_play to load vars for managed-node1 12180 1727204078.82759: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204078.82763: Calling groups_plugins_play to load vars for managed-node1 12180 1727204078.84450: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204078.86382: done with get_vars() 12180 1727204078.86403: variable 'ansible_search_path' from source: unknown 12180 1727204078.86404: variable 'ansible_search_path' from source: unknown 12180 1727204078.86452: we have included files to process 12180 1727204078.86454: generating all_blocks data 12180 1727204078.86456: done generating all_blocks data 12180 1727204078.86462: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 12180 1727204078.86463: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 12180 1727204078.86468: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 12180 1727204078.87467: done processing included file 12180 1727204078.87470: iterating over new_blocks loaded from include file 12180 1727204078.87472: in VariableManager get_vars() 12180 1727204078.87494: done with get_vars() 12180 1727204078.87497: filtering new block on tags 12180 1727204078.87531: done filtering new block on tags 12180 1727204078.87535: in VariableManager get_vars() 12180 1727204078.87556: done with 
get_vars() 12180 1727204078.87558: filtering new block on tags 12180 1727204078.87584: done filtering new block on tags 12180 1727204078.87586: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed-node1 12180 1727204078.87592: extending task lists for all hosts with included blocks 12180 1727204078.87796: done extending task lists 12180 1727204078.87797: done processing included files 12180 1727204078.87798: results queue empty 12180 1727204078.87799: checking for any_errors_fatal 12180 1727204078.87802: done checking for any_errors_fatal 12180 1727204078.87803: checking for max_fail_percentage 12180 1727204078.87804: done checking for max_fail_percentage 12180 1727204078.87804: checking to see if all hosts have failed and the running result is not ok 12180 1727204078.87805: done checking to see if all hosts have failed 12180 1727204078.87806: getting the remaining hosts for this loop 12180 1727204078.87807: done getting the remaining hosts for this loop 12180 1727204078.87809: getting the next task for host managed-node1 12180 1727204078.87814: done getting next task for host managed-node1 12180 1727204078.87816: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 12180 1727204078.87819: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12180 1727204078.87821: getting variables 12180 1727204078.87822: in VariableManager get_vars() 12180 1727204078.87838: Calling all_inventory to load vars for managed-node1 12180 1727204078.87840: Calling groups_inventory to load vars for managed-node1 12180 1727204078.87842: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204078.87852: Calling all_plugins_play to load vars for managed-node1 12180 1727204078.87855: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204078.87858: Calling groups_plugins_play to load vars for managed-node1 12180 1727204078.89162: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204078.90861: done with get_vars() 12180 1727204078.90892: done getting variables 12180 1727204078.90937: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Tuesday 24 September 2024 14:54:38 -0400 (0:00:00.098) 0:00:26.321 ***** 12180 1727204078.90970: entering _queue_task() for managed-node1/set_fact 12180 1727204078.91306: worker is 1 (out of 1 available) 12180 1727204078.91322: exiting _queue_task() for managed-node1/set_fact 12180 1727204078.91337: done queuing things up, now waiting for results queue to drain 12180 1727204078.91339: 
waiting for pending results... 12180 1727204078.91628: running TaskExecutor() for managed-node1/TASK: Initialize NM profile exist and ansible_managed comment flag 12180 1727204078.91734: in run() - task 0affcd87-79f5-ccb1-55ae-000000000443 12180 1727204078.91749: variable 'ansible_search_path' from source: unknown 12180 1727204078.91752: variable 'ansible_search_path' from source: unknown 12180 1727204078.91796: calling self._execute() 12180 1727204078.91896: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204078.91900: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204078.91911: variable 'omit' from source: magic vars 12180 1727204078.92287: variable 'ansible_distribution_major_version' from source: facts 12180 1727204078.92299: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204078.92309: variable 'omit' from source: magic vars 12180 1727204078.92362: variable 'omit' from source: magic vars 12180 1727204078.92398: variable 'omit' from source: magic vars 12180 1727204078.92449: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12180 1727204078.92485: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12180 1727204078.92505: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12180 1727204078.92526: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204078.92543: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204078.92575: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12180 1727204078.92578: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204078.92581: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204078.92690: Set connection var ansible_pipelining to False 12180 1727204078.92694: Set connection var ansible_shell_type to sh 12180 1727204078.92697: Set connection var ansible_timeout to 10 12180 1727204078.92703: Set connection var ansible_connection to ssh 12180 1727204078.92709: Set connection var ansible_shell_executable to /bin/sh 12180 1727204078.92714: Set connection var ansible_module_compression to ZIP_DEFLATED 12180 1727204078.92747: variable 'ansible_shell_executable' from source: unknown 12180 1727204078.92751: variable 'ansible_connection' from source: unknown 12180 1727204078.92760: variable 'ansible_module_compression' from source: unknown 12180 1727204078.92764: variable 'ansible_shell_type' from source: unknown 12180 1727204078.92769: variable 'ansible_shell_executable' from source: unknown 12180 1727204078.92771: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204078.92774: variable 'ansible_pipelining' from source: unknown 12180 1727204078.92776: variable 'ansible_timeout' from source: unknown 12180 1727204078.92781: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204078.92924: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12180 1727204078.92937: variable 'omit' from source: magic vars 12180 1727204078.92943: starting attempt loop 12180 1727204078.92946: running the handler 12180 1727204078.92966: handler run complete 12180 1727204078.92976: attempt loop complete, returning result 12180 1727204078.92983: _execute() done 12180 1727204078.92986: dumping result to json 12180 1727204078.92989: done dumping result, returning 12180 
1727204078.92997: done running TaskExecutor() for managed-node1/TASK: Initialize NM profile exist and ansible_managed comment flag [0affcd87-79f5-ccb1-55ae-000000000443] 12180 1727204078.93002: sending task result for task 0affcd87-79f5-ccb1-55ae-000000000443 12180 1727204078.93086: done sending task result for task 0affcd87-79f5-ccb1-55ae-000000000443 12180 1727204078.93089: WORKER PROCESS EXITING ok: [managed-node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 12180 1727204078.93148: no more pending results, returning what we have 12180 1727204078.93152: results queue empty 12180 1727204078.93153: checking for any_errors_fatal 12180 1727204078.93156: done checking for any_errors_fatal 12180 1727204078.93157: checking for max_fail_percentage 12180 1727204078.93159: done checking for max_fail_percentage 12180 1727204078.93159: checking to see if all hosts have failed and the running result is not ok 12180 1727204078.93160: done checking to see if all hosts have failed 12180 1727204078.93161: getting the remaining hosts for this loop 12180 1727204078.93162: done getting the remaining hosts for this loop 12180 1727204078.93168: getting the next task for host managed-node1 12180 1727204078.93177: done getting next task for host managed-node1 12180 1727204078.93179: ^ task is: TASK: Stat profile file 12180 1727204078.93183: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12180 1727204078.93188: getting variables 12180 1727204078.93190: in VariableManager get_vars() 12180 1727204078.93237: Calling all_inventory to load vars for managed-node1 12180 1727204078.93241: Calling groups_inventory to load vars for managed-node1 12180 1727204078.93244: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204078.93256: Calling all_plugins_play to load vars for managed-node1 12180 1727204078.93258: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204078.93261: Calling groups_plugins_play to load vars for managed-node1 12180 1727204078.95125: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204078.96880: done with get_vars() 12180 1727204078.96911: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Tuesday 24 September 2024 14:54:38 -0400 (0:00:00.060) 0:00:26.382 ***** 12180 1727204078.97009: entering _queue_task() for managed-node1/stat 12180 1727204078.97344: worker is 1 (out of 1 available) 12180 1727204078.97357: exiting _queue_task() for managed-node1/stat 12180 1727204078.97371: done queuing things up, now waiting for results queue to drain 12180 1727204078.97373: waiting for pending results... 
12180 1727204078.97665: running TaskExecutor() for managed-node1/TASK: Stat profile file 12180 1727204078.97780: in run() - task 0affcd87-79f5-ccb1-55ae-000000000444 12180 1727204078.97794: variable 'ansible_search_path' from source: unknown 12180 1727204078.97797: variable 'ansible_search_path' from source: unknown 12180 1727204078.97844: calling self._execute() 12180 1727204078.97950: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204078.97953: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204078.97969: variable 'omit' from source: magic vars 12180 1727204078.98358: variable 'ansible_distribution_major_version' from source: facts 12180 1727204078.98378: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204078.98384: variable 'omit' from source: magic vars 12180 1727204078.98426: variable 'omit' from source: magic vars 12180 1727204078.98537: variable 'profile' from source: include params 12180 1727204078.98541: variable 'item' from source: include params 12180 1727204078.98610: variable 'item' from source: include params 12180 1727204078.98628: variable 'omit' from source: magic vars 12180 1727204078.98674: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12180 1727204078.98719: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12180 1727204078.98743: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12180 1727204078.98759: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204078.98772: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204078.98812: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12180 
1727204078.98815: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204078.98817: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204078.98932: Set connection var ansible_pipelining to False 12180 1727204078.98938: Set connection var ansible_shell_type to sh 12180 1727204078.98946: Set connection var ansible_timeout to 10 12180 1727204078.98951: Set connection var ansible_connection to ssh 12180 1727204078.98959: Set connection var ansible_shell_executable to /bin/sh 12180 1727204078.98962: Set connection var ansible_module_compression to ZIP_DEFLATED 12180 1727204078.98989: variable 'ansible_shell_executable' from source: unknown 12180 1727204078.98992: variable 'ansible_connection' from source: unknown 12180 1727204078.98995: variable 'ansible_module_compression' from source: unknown 12180 1727204078.98997: variable 'ansible_shell_type' from source: unknown 12180 1727204078.98999: variable 'ansible_shell_executable' from source: unknown 12180 1727204078.99002: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204078.99011: variable 'ansible_pipelining' from source: unknown 12180 1727204078.99014: variable 'ansible_timeout' from source: unknown 12180 1727204078.99022: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204078.99245: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 12180 1727204078.99255: variable 'omit' from source: magic vars 12180 1727204078.99267: starting attempt loop 12180 1727204078.99270: running the handler 12180 1727204078.99279: _low_level_execute_command(): starting 12180 1727204078.99288: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12180 1727204079.00118: stderr chunk 
(state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204079.00137: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204079.00148: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204079.00169: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204079.00208: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204079.00217: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204079.00236: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204079.00253: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204079.00262: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204079.00271: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204079.00280: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204079.00290: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204079.00302: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204079.00310: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204079.00316: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204079.00326: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204079.00410: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204079.00431: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204079.00449: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 12180 1727204079.00557: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204079.02180: stdout chunk (state=3): >>>/root <<< 12180 1727204079.02357: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204079.02362: stdout chunk (state=3): >>><<< 12180 1727204079.02374: stderr chunk (state=3): >>><<< 12180 1727204079.02396: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204079.02412: _low_level_execute_command(): starting 12180 1727204079.02416: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204079.023958-14898-7115464943457 `" && echo ansible-tmp-1727204079.023958-14898-7115464943457="` echo 
/root/.ansible/tmp/ansible-tmp-1727204079.023958-14898-7115464943457 `" ) && sleep 0' 12180 1727204079.03069: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204079.03078: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204079.03089: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204079.03103: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204079.03145: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204079.03153: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204079.03163: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204079.03180: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204079.03189: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204079.03194: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204079.03206: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204079.03212: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204079.03223: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204079.03231: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204079.03241: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204079.03250: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204079.03321: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204079.03340: stderr 
chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204079.03350: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204079.03440: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204079.05281: stdout chunk (state=3): >>>ansible-tmp-1727204079.023958-14898-7115464943457=/root/.ansible/tmp/ansible-tmp-1727204079.023958-14898-7115464943457 <<< 12180 1727204079.05403: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204079.05500: stderr chunk (state=3): >>><<< 12180 1727204079.05505: stdout chunk (state=3): >>><<< 12180 1727204079.05527: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204079.023958-14898-7115464943457=/root/.ansible/tmp/ansible-tmp-1727204079.023958-14898-7115464943457 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 
1727204079.05578: variable 'ansible_module_compression' from source: unknown 12180 1727204079.05643: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12180cbnqllfr/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 12180 1727204079.05684: variable 'ansible_facts' from source: unknown 12180 1727204079.05759: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204079.023958-14898-7115464943457/AnsiballZ_stat.py 12180 1727204079.05908: Sending initial data 12180 1727204079.05911: Sent initial data (150 bytes) 12180 1727204079.06910: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204079.06918: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204079.06929: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204079.06946: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204079.06988: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204079.06996: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204079.07002: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204079.07016: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204079.07023: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204079.07030: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204079.07042: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204079.07050: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204079.07061: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 
1727204079.07074: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204079.07081: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204079.07090: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204079.07165: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204079.07184: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204079.07194: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204079.07447: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204079.09194: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12180 1727204079.09243: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12180 1727204079.09296: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12180cbnqllfr/tmpqmldx_cf /root/.ansible/tmp/ansible-tmp-1727204079.023958-14898-7115464943457/AnsiballZ_stat.py <<< 12180 1727204079.09350: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12180 1727204079.10586: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204079.10849: 
stderr chunk (state=3): >>><<< 12180 1727204079.10854: stdout chunk (state=3): >>><<< 12180 1727204079.10857: done transferring module to remote 12180 1727204079.10859: _low_level_execute_command(): starting 12180 1727204079.10874: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204079.023958-14898-7115464943457/ /root/.ansible/tmp/ansible-tmp-1727204079.023958-14898-7115464943457/AnsiballZ_stat.py && sleep 0' 12180 1727204079.11473: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204079.11477: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204079.11515: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204079.11519: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204079.11521: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204079.11601: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204079.11620: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204079.11708: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 
1727204079.13422: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204079.13533: stderr chunk (state=3): >>><<< 12180 1727204079.13545: stdout chunk (state=3): >>><<< 12180 1727204079.13663: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204079.13669: _low_level_execute_command(): starting 12180 1727204079.13672: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204079.023958-14898-7115464943457/AnsiballZ_stat.py && sleep 0' 12180 1727204079.14381: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204079.14398: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204079.14416: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 
1727204079.14442: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204079.14496: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204079.14514: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204079.14538: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204079.14558: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204079.14573: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204079.14586: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204079.14599: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204079.14614: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204079.14662: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204079.14684: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204079.14696: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204079.14803: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204079.14884: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204079.14916: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204079.14933: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204079.15083: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204079.28087: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": 
{"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.1", "follow": false, "checksum_algorithm": "sha1"}}} <<< 12180 1727204079.29125: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. <<< 12180 1727204079.29129: stdout chunk (state=3): >>><<< 12180 1727204079.29131: stderr chunk (state=3): >>><<< 12180 1727204079.29169: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.1", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
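The stat invocation above returns `{"changed": false, "stat": {"exists": false}}` for the ifcfg file. A minimal sketch of just the `exists` portion of that result, assuming it reduces to an `os.path` existence check (the real `stat` module also gathers attributes, checksums, and MIME info on request):

```python
import json
import os

def minimal_stat(path, follow=False):
    """Hedged sketch of the 'exists' field of the stat result logged above."""
    # With follow=True symlinks are resolved (os.path.exists); with the
    # module's default follow=False a dangling symlink still "exists"
    # (os.path.lexists). Only the boolean is modeled here.
    exists = os.path.exists(path) if follow else os.path.lexists(path)
    return {"changed": False, "stat": {"exists": exists}}

result = minimal_stat("/etc/sysconfig/network-scripts/ifcfg-bond0.1")
print(json.dumps(result))
```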
12180 1727204079.29272: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-bond0.1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204079.023958-14898-7115464943457/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12180 1727204079.29280: _low_level_execute_command(): starting 12180 1727204079.29282: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204079.023958-14898-7115464943457/ > /dev/null 2>&1 && sleep 0' 12180 1727204079.29893: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204079.29906: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204079.29932: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204079.29955: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204079.30003: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204079.30021: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204079.30042: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204079.30066: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 
1727204079.30079: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204079.30090: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204079.30102: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204079.30116: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204079.30134: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204079.30153: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204079.30170: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204079.30187: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204079.30272: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204079.30299: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204079.30317: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204079.30405: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204079.32216: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204079.32328: stderr chunk (state=3): >>><<< 12180 1727204079.32356: stdout chunk (state=3): >>><<< 12180 1727204079.32570: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204079.32574: handler run complete 12180 1727204079.32577: attempt loop complete, returning result 12180 1727204079.32580: _execute() done 12180 1727204079.32582: dumping result to json 12180 1727204079.32584: done dumping result, returning 12180 1727204079.32586: done running TaskExecutor() for managed-node1/TASK: Stat profile file [0affcd87-79f5-ccb1-55ae-000000000444] 12180 1727204079.32588: sending task result for task 0affcd87-79f5-ccb1-55ae-000000000444 12180 1727204079.32668: done sending task result for task 0affcd87-79f5-ccb1-55ae-000000000444 12180 1727204079.32672: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "stat": { "exists": false } } 12180 1727204079.32737: no more pending results, returning what we have 12180 1727204079.32742: results queue empty 12180 1727204079.32743: checking for any_errors_fatal 12180 1727204079.32749: done checking for any_errors_fatal 12180 1727204079.32750: checking for max_fail_percentage 12180 1727204079.32751: done checking for max_fail_percentage 12180 1727204079.32752: checking to see if all hosts have failed and the running result is not ok 12180 1727204079.32753: done checking to see if all hosts have failed 12180 1727204079.32754: getting the remaining hosts for this loop 12180 
1727204079.32755: done getting the remaining hosts for this loop 12180 1727204079.32759: getting the next task for host managed-node1 12180 1727204079.32769: done getting next task for host managed-node1 12180 1727204079.32772: ^ task is: TASK: Set NM profile exist flag based on the profile files 12180 1727204079.32777: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12180 1727204079.32781: getting variables 12180 1727204079.32783: in VariableManager get_vars() 12180 1727204079.32828: Calling all_inventory to load vars for managed-node1 12180 1727204079.32832: Calling groups_inventory to load vars for managed-node1 12180 1727204079.32834: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204079.32847: Calling all_plugins_play to load vars for managed-node1 12180 1727204079.32850: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204079.32853: Calling groups_plugins_play to load vars for managed-node1 12180 1727204079.34738: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204079.36670: done with get_vars() 12180 1727204079.36707: done getting variables 12180 1727204079.36771: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Tuesday 24 September 2024 14:54:39 -0400 (0:00:00.397) 0:00:26.780 ***** 12180 1727204079.36813: entering _queue_task() for managed-node1/set_fact 12180 1727204079.37160: worker is 1 (out of 1 available) 12180 1727204079.37174: exiting _queue_task() for managed-node1/set_fact 12180 1727204079.37187: done queuing things up, now waiting for results queue to drain 12180 1727204079.37188: waiting for pending results... 
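The "Evaluated conditional" lines that gate this task can be sketched as a two-step check: the role-level distribution guard, then the task's own `when:` clause. A hedged stand-in (the major-version value "9" is illustrative, not taken from this log; Ansible actually templates these expressions through Jinja2):

```python
# Minimal sketch of the conditional gate logged for this task.
class Attr(dict):
    # Stand-in letting dotted access mimic Jinja2's profile_stat.stat.exists
    __getattr__ = dict.__getitem__

profile_stat = Attr(stat=Attr(exists=False))
ansible_distribution_major_version = "9"  # illustrative value

conditions = [
    ("ansible_distribution_major_version != '6'",
     ansible_distribution_major_version != "6"),
    ("profile_stat.stat.exists", bool(profile_stat.stat.exists)),
]
for text, result in conditions:
    # Matches the log's "Evaluated conditional (...)" wording
    print(f"Evaluated conditional ({text}): {result}")
```

When the second condition is False the task is skipped with `"skip_reason": "Conditional result was False"`, exactly as the result below shows.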
12180 1727204079.37485: running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag based on the profile files 12180 1727204079.37619: in run() - task 0affcd87-79f5-ccb1-55ae-000000000445 12180 1727204079.37646: variable 'ansible_search_path' from source: unknown 12180 1727204079.37655: variable 'ansible_search_path' from source: unknown 12180 1727204079.37701: calling self._execute() 12180 1727204079.37807: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204079.37819: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204079.37833: variable 'omit' from source: magic vars 12180 1727204079.38217: variable 'ansible_distribution_major_version' from source: facts 12180 1727204079.38235: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204079.38371: variable 'profile_stat' from source: set_fact 12180 1727204079.38396: Evaluated conditional (profile_stat.stat.exists): False 12180 1727204079.38404: when evaluation is False, skipping this task 12180 1727204079.38413: _execute() done 12180 1727204079.38425: dumping result to json 12180 1727204079.38433: done dumping result, returning 12180 1727204079.38444: done running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag based on the profile files [0affcd87-79f5-ccb1-55ae-000000000445] 12180 1727204079.38456: sending task result for task 0affcd87-79f5-ccb1-55ae-000000000445 skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 12180 1727204079.38611: no more pending results, returning what we have 12180 1727204079.38615: results queue empty 12180 1727204079.38616: checking for any_errors_fatal 12180 1727204079.38624: done checking for any_errors_fatal 12180 1727204079.38625: checking for max_fail_percentage 12180 1727204079.38627: done checking for max_fail_percentage 12180 1727204079.38627: checking to see if all 
hosts have failed and the running result is not ok 12180 1727204079.38629: done checking to see if all hosts have failed 12180 1727204079.38629: getting the remaining hosts for this loop 12180 1727204079.38631: done getting the remaining hosts for this loop 12180 1727204079.38635: getting the next task for host managed-node1 12180 1727204079.38643: done getting next task for host managed-node1 12180 1727204079.38646: ^ task is: TASK: Get NM profile info 12180 1727204079.38652: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12180 1727204079.38657: getting variables 12180 1727204079.38659: in VariableManager get_vars() 12180 1727204079.38709: Calling all_inventory to load vars for managed-node1 12180 1727204079.38712: Calling groups_inventory to load vars for managed-node1 12180 1727204079.38715: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204079.38730: Calling all_plugins_play to load vars for managed-node1 12180 1727204079.38733: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204079.38736: Calling groups_plugins_play to load vars for managed-node1 12180 1727204079.39722: done sending task result for task 0affcd87-79f5-ccb1-55ae-000000000445 12180 1727204079.39726: WORKER PROCESS EXITING 12180 1727204079.40638: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204079.41728: done with get_vars() 12180 1727204079.41750: done getting variables 12180 1727204079.41796: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Tuesday 24 September 2024 14:54:39 -0400 (0:00:00.050) 0:00:26.830 ***** 12180 1727204079.41822: entering _queue_task() for managed-node1/shell 12180 1727204079.42055: worker is 1 (out of 1 available) 12180 1727204079.42072: exiting _queue_task() for managed-node1/shell 12180 1727204079.42086: done queuing things up, now waiting for results queue to drain 12180 1727204079.42088: waiting for pending results... 
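Each module run in this log follows the same remote lifecycle: create a private temp dir (`umask 77 && mkdir -p ...`), transfer the AnsiballZ payload over sftp, `chmod u+x` it, execute it with the remote interpreter, then `rm -f -r` the directory. A condensed local sketch of that sequence (paths and the stub payload are illustrative, not this run's):

```python
# Hedged sketch of the per-task remote lifecycle visible in this log.
import json
import os
import shutil
import subprocess
import sys
import tempfile

workdir = tempfile.mkdtemp(prefix="ansible-tmp-example-")  # stands in for ~/.ansible/tmp/ansible-tmp-...
payload = os.path.join(workdir, "AnsiballZ_stat.py")

# Stub standing in for the sftp'd AnsiballZ payload
with open(payload, "w") as f:
    f.write('import json; print(json.dumps({"changed": False, "stat": {"exists": False}}))\n')

os.chmod(workdir, 0o700)   # mirrors the umask 77 at mkdir time
os.chmod(payload, 0o700)   # mirrors 'chmod u+x ... AnsiballZ_stat.py'

# Execute with the interpreter, as in '/usr/bin/python3.9 .../AnsiballZ_stat.py'
out = subprocess.run([sys.executable, payload], capture_output=True, text=True)
print(out.stdout.strip())

shutil.rmtree(workdir)     # mirrors the final 'rm -f -r .../ansible-tmp-.../'
```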
12180 1727204079.42272: running TaskExecutor() for managed-node1/TASK: Get NM profile info 12180 1727204079.42356: in run() - task 0affcd87-79f5-ccb1-55ae-000000000446 12180 1727204079.42370: variable 'ansible_search_path' from source: unknown 12180 1727204079.42374: variable 'ansible_search_path' from source: unknown 12180 1727204079.42403: calling self._execute() 12180 1727204079.42481: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204079.42485: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204079.42493: variable 'omit' from source: magic vars 12180 1727204079.42852: variable 'ansible_distribution_major_version' from source: facts 12180 1727204079.42874: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204079.42885: variable 'omit' from source: magic vars 12180 1727204079.42931: variable 'omit' from source: magic vars 12180 1727204079.43046: variable 'profile' from source: include params 12180 1727204079.43068: variable 'item' from source: include params 12180 1727204079.43139: variable 'item' from source: include params 12180 1727204079.43174: variable 'omit' from source: magic vars 12180 1727204079.43225: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12180 1727204079.43268: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12180 1727204079.43302: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12180 1727204079.43324: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204079.43339: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204079.43374: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12180 
1727204079.43391: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204079.43402: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204079.43513: Set connection var ansible_pipelining to False 12180 1727204079.43521: Set connection var ansible_shell_type to sh 12180 1727204079.43532: Set connection var ansible_timeout to 10 12180 1727204079.43541: Set connection var ansible_connection to ssh 12180 1727204079.43549: Set connection var ansible_shell_executable to /bin/sh 12180 1727204079.43558: Set connection var ansible_module_compression to ZIP_DEFLATED 12180 1727204079.43591: variable 'ansible_shell_executable' from source: unknown 12180 1727204079.43606: variable 'ansible_connection' from source: unknown 12180 1727204079.43619: variable 'ansible_module_compression' from source: unknown 12180 1727204079.43626: variable 'ansible_shell_type' from source: unknown 12180 1727204079.43633: variable 'ansible_shell_executable' from source: unknown 12180 1727204079.43641: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204079.43649: variable 'ansible_pipelining' from source: unknown 12180 1727204079.43657: variable 'ansible_timeout' from source: unknown 12180 1727204079.43671: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204079.43832: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12180 1727204079.43849: variable 'omit' from source: magic vars 12180 1727204079.43860: starting attempt loop 12180 1727204079.43870: running the handler 12180 1727204079.43885: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12180 1727204079.43906: _low_level_execute_command(): starting 12180 1727204079.43918: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12180 1727204079.44509: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204079.44528: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204079.44541: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 12180 1727204079.44556: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204079.44570: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204079.44615: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204079.44633: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204079.44698: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204079.46288: stdout chunk (state=3): >>>/root <<< 12180 1727204079.46463: stderr chunk (state=3): >>>debug2: Received 
exit status from master 0 <<< 12180 1727204079.46471: stdout chunk (state=3): >>><<< 12180 1727204079.46481: stderr chunk (state=3): >>><<< 12180 1727204079.46505: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204079.46520: _low_level_execute_command(): starting 12180 1727204079.46527: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204079.4650588-14912-110495585758263 `" && echo ansible-tmp-1727204079.4650588-14912-110495585758263="` echo /root/.ansible/tmp/ansible-tmp-1727204079.4650588-14912-110495585758263 `" ) && sleep 0' 12180 1727204079.47177: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204079.47190: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 12180 1727204079.47198: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204079.47210: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204079.47250: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204079.47259: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204079.47276: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204079.47289: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204079.47298: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204079.47308: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204079.47311: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204079.47322: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204079.47334: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204079.47341: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204079.47349: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204079.47359: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204079.47435: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204079.47454: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204079.47468: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204079.47568: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 
1727204079.49424: stdout chunk (state=3): >>>ansible-tmp-1727204079.4650588-14912-110495585758263=/root/.ansible/tmp/ansible-tmp-1727204079.4650588-14912-110495585758263 <<< 12180 1727204079.49609: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204079.49613: stdout chunk (state=3): >>><<< 12180 1727204079.49621: stderr chunk (state=3): >>><<< 12180 1727204079.49642: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204079.4650588-14912-110495585758263=/root/.ansible/tmp/ansible-tmp-1727204079.4650588-14912-110495585758263 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204079.49676: variable 'ansible_module_compression' from source: unknown 12180 1727204079.49732: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12180cbnqllfr/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12180 1727204079.49769: variable 
'ansible_facts' from source: unknown 12180 1727204079.49842: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204079.4650588-14912-110495585758263/AnsiballZ_command.py 12180 1727204079.49989: Sending initial data 12180 1727204079.49992: Sent initial data (156 bytes) 12180 1727204079.50946: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204079.50957: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204079.50968: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204079.50984: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204079.51022: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204079.51032: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204079.51039: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204079.51053: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204079.51061: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204079.51077: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204079.51085: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204079.51095: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204079.51107: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204079.51116: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204079.51119: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204079.51132: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204079.51202: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204079.51219: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204079.51233: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204079.51319: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204079.53036: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12180 1727204079.53084: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12180 1727204079.53140: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12180cbnqllfr/tmp5_hxsr0f /root/.ansible/tmp/ansible-tmp-1727204079.4650588-14912-110495585758263/AnsiballZ_command.py <<< 12180 1727204079.53191: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12180 1727204079.54672: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204079.54827: stderr chunk (state=3): >>><<< 12180 1727204079.54833: stdout chunk (state=3): >>><<< 12180 1727204079.54836: done transferring module to remote 12180 1727204079.54838: _low_level_execute_command(): starting 12180 1727204079.54840: 
_low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204079.4650588-14912-110495585758263/ /root/.ansible/tmp/ansible-tmp-1727204079.4650588-14912-110495585758263/AnsiballZ_command.py && sleep 0' 12180 1727204079.56377: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204079.56382: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204079.56437: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204079.56450: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204079.56468: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204079.56487: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204079.56509: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204079.56520: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204079.56536: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204079.56549: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204079.56565: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204079.56577: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204079.56587: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204079.56598: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204079.56686: stderr chunk (state=3): >>>debug1: auto-mux: Trying 
existing master <<< 12180 1727204079.56712: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204079.56739: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204079.56840: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204079.58645: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204079.58650: stdout chunk (state=3): >>><<< 12180 1727204079.58652: stderr chunk (state=3): >>><<< 12180 1727204079.58758: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204079.58764: _low_level_execute_command(): starting 12180 1727204079.58767: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204079.4650588-14912-110495585758263/AnsiballZ_command.py 
&& sleep 0' 12180 1727204079.59380: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204079.59394: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204079.59408: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204079.59432: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204079.59478: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204079.59490: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204079.59505: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204079.59524: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204079.59547: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204079.59560: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204079.59575: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204079.59590: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204079.59607: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204079.59621: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204079.59633: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204079.59650: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204079.59736: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204079.59760: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 
1727204079.59785: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204079.59891: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204079.75483: stdout chunk (state=3): >>> {"changed": true, "stdout": "bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "start": "2024-09-24 14:54:39.729909", "end": "2024-09-24 14:54:39.754276", "delta": "0:00:00.024367", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12180 1727204079.76655: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
<<< 12180 1727204079.76725: stderr chunk (state=3): >>><<< 12180 1727204079.76732: stdout chunk (state=3): >>><<< 12180 1727204079.76752: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "start": "2024-09-24 14:54:39.729909", "end": "2024-09-24 14:54:39.754276", "delta": "0:00:00.024367", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
12180 1727204079.76799: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204079.4650588-14912-110495585758263/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12180 1727204079.76806: _low_level_execute_command(): starting 12180 1727204079.76812: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204079.4650588-14912-110495585758263/ > /dev/null 2>&1 && sleep 0' 12180 1727204079.77372: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204079.77376: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204079.77406: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204079.77410: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 
1727204079.77412: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204079.77461: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204079.77475: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204079.77540: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204079.79331: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204079.79395: stderr chunk (state=3): >>><<< 12180 1727204079.79399: stdout chunk (state=3): >>><<< 12180 1727204079.79418: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204079.79425: handler run complete 12180 1727204079.79459: Evaluated conditional (False): False 12180 1727204079.79469: attempt loop complete, returning result 12180 1727204079.79472: _execute() done 12180 1727204079.79474: dumping result to json 12180 1727204079.79494: done dumping result, returning 12180 1727204079.79497: done running TaskExecutor() for managed-node1/TASK: Get NM profile info [0affcd87-79f5-ccb1-55ae-000000000446] 12180 1727204079.79499: sending task result for task 0affcd87-79f5-ccb1-55ae-000000000446 12180 1727204079.79605: done sending task result for task 0affcd87-79f5-ccb1-55ae-000000000446 12180 1727204079.79609: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "delta": "0:00:00.024367", "end": "2024-09-24 14:54:39.754276", "rc": 0, "start": "2024-09-24 14:54:39.729909" } STDOUT: bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection 12180 1727204079.79749: no more pending results, returning what we have 12180 1727204079.79753: results queue empty 12180 1727204079.79755: checking for any_errors_fatal 12180 1727204079.79800: done checking for any_errors_fatal 12180 1727204079.79802: checking for max_fail_percentage 12180 1727204079.79805: done checking for max_fail_percentage 12180 1727204079.79805: checking to see if all hosts have failed and the running result is not ok 12180 1727204079.79806: done checking to see if all hosts have failed 12180 1727204079.79807: getting the remaining hosts for this loop 12180 1727204079.79808: done getting the remaining hosts for this loop 12180 1727204079.79812: getting the next task for host managed-node1 12180 1727204079.79819: done getting next task for host managed-node1 12180 1727204079.79821: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 12180 
1727204079.79825: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12180 1727204079.79831: getting variables 12180 1727204079.79833: in VariableManager get_vars() 12180 1727204079.79878: Calling all_inventory to load vars for managed-node1 12180 1727204079.79881: Calling groups_inventory to load vars for managed-node1 12180 1727204079.79883: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204079.79895: Calling all_plugins_play to load vars for managed-node1 12180 1727204079.79897: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204079.79899: Calling groups_plugins_play to load vars for managed-node1 12180 1727204079.80760: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204079.81785: done with get_vars() 12180 1727204079.81802: done getting variables 12180 1727204079.81849: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Tuesday 24 September 2024 14:54:39 -0400 (0:00:00.400) 0:00:27.230 ***** 12180 1727204079.81878: entering _queue_task() for managed-node1/set_fact 12180 1727204079.82106: worker is 1 (out of 1 available) 12180 1727204079.82119: exiting _queue_task() for managed-node1/set_fact 12180 1727204079.82134: done queuing things up, now waiting for results queue to drain 12180 1727204079.82135: waiting for pending results... 12180 1727204079.82311: running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 12180 1727204079.82396: in run() - task 0affcd87-79f5-ccb1-55ae-000000000447 12180 1727204079.82409: variable 'ansible_search_path' from source: unknown 12180 1727204079.82413: variable 'ansible_search_path' from source: unknown 12180 1727204079.82444: calling self._execute() 12180 1727204079.82521: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204079.82524: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204079.82536: variable 'omit' from source: magic vars 12180 1727204079.82810: variable 'ansible_distribution_major_version' from source: facts 12180 1727204079.82821: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204079.82916: variable 'nm_profile_exists' from source: set_fact 12180 1727204079.82928: Evaluated conditional (nm_profile_exists.rc == 0): True 12180 1727204079.82935: variable 'omit' from source: magic vars 12180 1727204079.82967: variable 'omit' from source: magic vars 12180 1727204079.82988: variable 'omit' from source: magic vars 12180 1727204079.83023: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 
12180 1727204079.83053: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12180 1727204079.83071: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12180 1727204079.83084: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204079.83093: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204079.83118: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12180 1727204079.83123: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204079.83125: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204079.83198: Set connection var ansible_pipelining to False 12180 1727204079.83201: Set connection var ansible_shell_type to sh 12180 1727204079.83206: Set connection var ansible_timeout to 10 12180 1727204079.83212: Set connection var ansible_connection to ssh 12180 1727204079.83218: Set connection var ansible_shell_executable to /bin/sh 12180 1727204079.83222: Set connection var ansible_module_compression to ZIP_DEFLATED 12180 1727204079.83247: variable 'ansible_shell_executable' from source: unknown 12180 1727204079.83250: variable 'ansible_connection' from source: unknown 12180 1727204079.83253: variable 'ansible_module_compression' from source: unknown 12180 1727204079.83256: variable 'ansible_shell_type' from source: unknown 12180 1727204079.83258: variable 'ansible_shell_executable' from source: unknown 12180 1727204079.83260: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204079.83262: variable 'ansible_pipelining' from source: unknown 12180 1727204079.83268: variable 'ansible_timeout' from source: unknown 12180 1727204079.83270: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204079.83373: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12180 1727204079.83382: variable 'omit' from source: magic vars 12180 1727204079.83387: starting attempt loop 12180 1727204079.83390: running the handler 12180 1727204079.83401: handler run complete 12180 1727204079.83409: attempt loop complete, returning result 12180 1727204079.83412: _execute() done 12180 1727204079.83415: dumping result to json 12180 1727204079.83417: done dumping result, returning 12180 1727204079.83423: done running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0affcd87-79f5-ccb1-55ae-000000000447] 12180 1727204079.83435: sending task result for task 0affcd87-79f5-ccb1-55ae-000000000447 12180 1727204079.83520: done sending task result for task 0affcd87-79f5-ccb1-55ae-000000000447 12180 1727204079.83523: WORKER PROCESS EXITING ok: [managed-node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 12180 1727204079.83581: no more pending results, returning what we have 12180 1727204079.83584: results queue empty 12180 1727204079.83585: checking for any_errors_fatal 12180 1727204079.83598: done checking for any_errors_fatal 12180 1727204079.83598: checking for max_fail_percentage 12180 1727204079.83600: done checking for max_fail_percentage 12180 1727204079.83601: checking to see if all hosts have failed and the running result is not ok 12180 1727204079.83602: done checking to see if all hosts have failed 12180 1727204079.83602: getting the remaining hosts for this loop 12180 
1727204079.83604: done getting the remaining hosts for this loop 12180 1727204079.83608: getting the next task for host managed-node1 12180 1727204079.83617: done getting next task for host managed-node1 12180 1727204079.83619: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 12180 1727204079.83623: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12180 1727204079.83626: getting variables 12180 1727204079.83627: in VariableManager get_vars() 12180 1727204079.83669: Calling all_inventory to load vars for managed-node1 12180 1727204079.83672: Calling groups_inventory to load vars for managed-node1 12180 1727204079.83678: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204079.83689: Calling all_plugins_play to load vars for managed-node1 12180 1727204079.83691: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204079.83694: Calling groups_plugins_play to load vars for managed-node1 12180 1727204079.84486: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204079.85394: done with get_vars() 12180 1727204079.85414: done getting variables 12180 1727204079.85456: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 12180 1727204079.85544: variable 'profile' from source: include params 12180 1727204079.85547: variable 'item' from source: include params 12180 1727204079.85590: variable 'item' from source: include params TASK [Get the ansible_managed comment in ifcfg-bond0.1] ************************ task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Tuesday 24 September 2024 14:54:39 -0400 (0:00:00.037) 0:00:27.268 ***** 12180 1727204079.85619: entering _queue_task() for managed-node1/command 12180 1727204079.85840: worker is 1 (out of 1 available) 12180 1727204079.85852: exiting _queue_task() for managed-node1/command 12180 1727204079.85865: done queuing things up, now waiting for results queue to drain 12180 1727204079.85867: waiting for pending results... 
12180 1727204079.86045: running TaskExecutor() for managed-node1/TASK: Get the ansible_managed comment in ifcfg-bond0.1 12180 1727204079.86125: in run() - task 0affcd87-79f5-ccb1-55ae-000000000449 12180 1727204079.86138: variable 'ansible_search_path' from source: unknown 12180 1727204079.86141: variable 'ansible_search_path' from source: unknown 12180 1727204079.86174: calling self._execute() 12180 1727204079.86245: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204079.86248: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204079.86257: variable 'omit' from source: magic vars 12180 1727204079.86527: variable 'ansible_distribution_major_version' from source: facts 12180 1727204079.86541: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204079.86628: variable 'profile_stat' from source: set_fact 12180 1727204079.86640: Evaluated conditional (profile_stat.stat.exists): False 12180 1727204079.86643: when evaluation is False, skipping this task 12180 1727204079.86646: _execute() done 12180 1727204079.86649: dumping result to json 12180 1727204079.86651: done dumping result, returning 12180 1727204079.86658: done running TaskExecutor() for managed-node1/TASK: Get the ansible_managed comment in ifcfg-bond0.1 [0affcd87-79f5-ccb1-55ae-000000000449] 12180 1727204079.86663: sending task result for task 0affcd87-79f5-ccb1-55ae-000000000449 12180 1727204079.86746: done sending task result for task 0affcd87-79f5-ccb1-55ae-000000000449 12180 1727204079.86749: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 12180 1727204079.86801: no more pending results, returning what we have 12180 1727204079.86806: results queue empty 12180 1727204079.86807: checking for any_errors_fatal 12180 1727204079.86814: done checking for any_errors_fatal 12180 1727204079.86814: 
checking for max_fail_percentage 12180 1727204079.86817: done checking for max_fail_percentage 12180 1727204079.86817: checking to see if all hosts have failed and the running result is not ok 12180 1727204079.86818: done checking to see if all hosts have failed 12180 1727204079.86819: getting the remaining hosts for this loop 12180 1727204079.86820: done getting the remaining hosts for this loop 12180 1727204079.86824: getting the next task for host managed-node1 12180 1727204079.86830: done getting next task for host managed-node1 12180 1727204079.86833: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 12180 1727204079.86837: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12180 1727204079.86840: getting variables 12180 1727204079.86842: in VariableManager get_vars() 12180 1727204079.86887: Calling all_inventory to load vars for managed-node1 12180 1727204079.86890: Calling groups_inventory to load vars for managed-node1 12180 1727204079.86893: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204079.86902: Calling all_plugins_play to load vars for managed-node1 12180 1727204079.86904: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204079.86907: Calling groups_plugins_play to load vars for managed-node1 12180 1727204079.87812: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204079.88714: done with get_vars() 12180 1727204079.88734: done getting variables 12180 1727204079.88781: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 12180 1727204079.88867: variable 'profile' from source: include params 12180 1727204079.88870: variable 'item' from source: include params 12180 1727204079.88910: variable 'item' from source: include params TASK [Verify the ansible_managed comment in ifcfg-bond0.1] ********************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Tuesday 24 September 2024 14:54:39 -0400 (0:00:00.033) 0:00:27.301 ***** 12180 1727204079.88935: entering _queue_task() for managed-node1/set_fact 12180 1727204079.89169: worker is 1 (out of 1 available) 12180 1727204079.89183: exiting _queue_task() for managed-node1/set_fact 12180 1727204079.89195: done queuing things up, now waiting for results queue to drain 12180 1727204079.89197: waiting for pending results... 
12180 1727204079.89386: running TaskExecutor() for managed-node1/TASK: Verify the ansible_managed comment in ifcfg-bond0.1 12180 1727204079.89463: in run() - task 0affcd87-79f5-ccb1-55ae-00000000044a 12180 1727204079.89477: variable 'ansible_search_path' from source: unknown 12180 1727204079.89480: variable 'ansible_search_path' from source: unknown 12180 1727204079.89508: calling self._execute() 12180 1727204079.89589: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204079.89593: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204079.89603: variable 'omit' from source: magic vars 12180 1727204079.89877: variable 'ansible_distribution_major_version' from source: facts 12180 1727204079.89887: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204079.89976: variable 'profile_stat' from source: set_fact 12180 1727204079.89987: Evaluated conditional (profile_stat.stat.exists): False 12180 1727204079.89990: when evaluation is False, skipping this task 12180 1727204079.89993: _execute() done 12180 1727204079.89996: dumping result to json 12180 1727204079.89998: done dumping result, returning 12180 1727204079.90004: done running TaskExecutor() for managed-node1/TASK: Verify the ansible_managed comment in ifcfg-bond0.1 [0affcd87-79f5-ccb1-55ae-00000000044a] 12180 1727204079.90016: sending task result for task 0affcd87-79f5-ccb1-55ae-00000000044a 12180 1727204079.90098: done sending task result for task 0affcd87-79f5-ccb1-55ae-00000000044a 12180 1727204079.90101: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 12180 1727204079.90153: no more pending results, returning what we have 12180 1727204079.90157: results queue empty 12180 1727204079.90158: checking for any_errors_fatal 12180 1727204079.90163: done checking for any_errors_fatal 12180 1727204079.90165: 
checking for max_fail_percentage 12180 1727204079.90167: done checking for max_fail_percentage 12180 1727204079.90168: checking to see if all hosts have failed and the running result is not ok 12180 1727204079.90168: done checking to see if all hosts have failed 12180 1727204079.90169: getting the remaining hosts for this loop 12180 1727204079.90170: done getting the remaining hosts for this loop 12180 1727204079.90174: getting the next task for host managed-node1 12180 1727204079.90182: done getting next task for host managed-node1 12180 1727204079.90184: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 12180 1727204079.90188: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12180 1727204079.90193: getting variables 12180 1727204079.90194: in VariableManager get_vars() 12180 1727204079.90240: Calling all_inventory to load vars for managed-node1 12180 1727204079.90243: Calling groups_inventory to load vars for managed-node1 12180 1727204079.90244: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204079.90255: Calling all_plugins_play to load vars for managed-node1 12180 1727204079.90257: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204079.90260: Calling groups_plugins_play to load vars for managed-node1 12180 1727204079.91073: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204079.91990: done with get_vars() 12180 1727204079.92009: done getting variables 12180 1727204079.92052: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 12180 1727204079.92137: variable 'profile' from source: include params 12180 1727204079.92140: variable 'item' from source: include params 12180 1727204079.92182: variable 'item' from source: include params TASK [Get the fingerprint comment in ifcfg-bond0.1] **************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Tuesday 24 September 2024 14:54:39 -0400 (0:00:00.032) 0:00:27.334 ***** 12180 1727204079.92206: entering _queue_task() for managed-node1/command 12180 1727204079.92437: worker is 1 (out of 1 available) 12180 1727204079.92450: exiting _queue_task() for managed-node1/command 12180 1727204079.92462: done queuing things up, now waiting for results queue to drain 12180 1727204079.92466: waiting for pending results... 
12180 1727204079.92650: running TaskExecutor() for managed-node1/TASK: Get the fingerprint comment in ifcfg-bond0.1 12180 1727204079.92736: in run() - task 0affcd87-79f5-ccb1-55ae-00000000044b 12180 1727204079.92747: variable 'ansible_search_path' from source: unknown 12180 1727204079.92751: variable 'ansible_search_path' from source: unknown 12180 1727204079.92781: calling self._execute() 12180 1727204079.92854: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204079.92857: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204079.92870: variable 'omit' from source: magic vars 12180 1727204079.93130: variable 'ansible_distribution_major_version' from source: facts 12180 1727204079.93143: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204079.93232: variable 'profile_stat' from source: set_fact 12180 1727204079.93244: Evaluated conditional (profile_stat.stat.exists): False 12180 1727204079.93247: when evaluation is False, skipping this task 12180 1727204079.93250: _execute() done 12180 1727204079.93252: dumping result to json 12180 1727204079.93254: done dumping result, returning 12180 1727204079.93262: done running TaskExecutor() for managed-node1/TASK: Get the fingerprint comment in ifcfg-bond0.1 [0affcd87-79f5-ccb1-55ae-00000000044b] 12180 1727204079.93268: sending task result for task 0affcd87-79f5-ccb1-55ae-00000000044b 12180 1727204079.93349: done sending task result for task 0affcd87-79f5-ccb1-55ae-00000000044b 12180 1727204079.93352: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 12180 1727204079.93408: no more pending results, returning what we have 12180 1727204079.93412: results queue empty 12180 1727204079.93413: checking for any_errors_fatal 12180 1727204079.93419: done checking for any_errors_fatal 12180 1727204079.93420: checking for 
max_fail_percentage 12180 1727204079.93422: done checking for max_fail_percentage 12180 1727204079.93423: checking to see if all hosts have failed and the running result is not ok 12180 1727204079.93424: done checking to see if all hosts have failed 12180 1727204079.93424: getting the remaining hosts for this loop 12180 1727204079.93426: done getting the remaining hosts for this loop 12180 1727204079.93429: getting the next task for host managed-node1 12180 1727204079.93436: done getting next task for host managed-node1 12180 1727204079.93438: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 12180 1727204079.93442: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12180 1727204079.93445: getting variables 12180 1727204079.93447: in VariableManager get_vars() 12180 1727204079.93491: Calling all_inventory to load vars for managed-node1 12180 1727204079.93494: Calling groups_inventory to load vars for managed-node1 12180 1727204079.93496: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204079.93506: Calling all_plugins_play to load vars for managed-node1 12180 1727204079.93508: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204079.93511: Calling groups_plugins_play to load vars for managed-node1 12180 1727204079.97846: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204079.98753: done with get_vars() 12180 1727204079.98775: done getting variables 12180 1727204079.98813: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 12180 1727204079.98888: variable 'profile' from source: include params 12180 1727204079.98890: variable 'item' from source: include params 12180 1727204079.98932: variable 'item' from source: include params TASK [Verify the fingerprint comment in ifcfg-bond0.1] ************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Tuesday 24 September 2024 14:54:39 -0400 (0:00:00.067) 0:00:27.401 ***** 12180 1727204079.98953: entering _queue_task() for managed-node1/set_fact 12180 1727204079.99191: worker is 1 (out of 1 available) 12180 1727204079.99205: exiting _queue_task() for managed-node1/set_fact 12180 1727204079.99217: done queuing things up, now waiting for results queue to drain 12180 1727204079.99219: waiting for pending results... 
12180 1727204079.99405: running TaskExecutor() for managed-node1/TASK: Verify the fingerprint comment in ifcfg-bond0.1 12180 1727204079.99485: in run() - task 0affcd87-79f5-ccb1-55ae-00000000044c 12180 1727204079.99497: variable 'ansible_search_path' from source: unknown 12180 1727204079.99500: variable 'ansible_search_path' from source: unknown 12180 1727204079.99531: calling self._execute() 12180 1727204079.99603: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204079.99607: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204079.99617: variable 'omit' from source: magic vars 12180 1727204079.99901: variable 'ansible_distribution_major_version' from source: facts 12180 1727204079.99917: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204080.00002: variable 'profile_stat' from source: set_fact 12180 1727204080.00015: Evaluated conditional (profile_stat.stat.exists): False 12180 1727204080.00019: when evaluation is False, skipping this task 12180 1727204080.00023: _execute() done 12180 1727204080.00026: dumping result to json 12180 1727204080.00029: done dumping result, returning 12180 1727204080.00034: done running TaskExecutor() for managed-node1/TASK: Verify the fingerprint comment in ifcfg-bond0.1 [0affcd87-79f5-ccb1-55ae-00000000044c] 12180 1727204080.00037: sending task result for task 0affcd87-79f5-ccb1-55ae-00000000044c 12180 1727204080.00128: done sending task result for task 0affcd87-79f5-ccb1-55ae-00000000044c 12180 1727204080.00133: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 12180 1727204080.00184: no more pending results, returning what we have 12180 1727204080.00187: results queue empty 12180 1727204080.00188: checking for any_errors_fatal 12180 1727204080.00195: done checking for any_errors_fatal 12180 1727204080.00195: checking 
for max_fail_percentage 12180 1727204080.00197: done checking for max_fail_percentage 12180 1727204080.00198: checking to see if all hosts have failed and the running result is not ok 12180 1727204080.00199: done checking to see if all hosts have failed 12180 1727204080.00200: getting the remaining hosts for this loop 12180 1727204080.00201: done getting the remaining hosts for this loop 12180 1727204080.00205: getting the next task for host managed-node1 12180 1727204080.00214: done getting next task for host managed-node1 12180 1727204080.00217: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 12180 1727204080.00219: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12180 1727204080.00223: getting variables 12180 1727204080.00224: in VariableManager get_vars() 12180 1727204080.00273: Calling all_inventory to load vars for managed-node1 12180 1727204080.00276: Calling groups_inventory to load vars for managed-node1 12180 1727204080.00278: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204080.00289: Calling all_plugins_play to load vars for managed-node1 12180 1727204080.00291: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204080.00294: Calling groups_plugins_play to load vars for managed-node1 12180 1727204080.01090: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204080.02028: done with get_vars() 12180 1727204080.02046: done getting variables 12180 1727204080.02128: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 12180 1727204080.02281: variable 'profile' from source: include params 12180 1727204080.02283: variable 'item' from source: include params 12180 1727204080.02322: variable 'item' from source: include params TASK [Assert that the profile is present - 'bond0.1'] ************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Tuesday 24 September 2024 14:54:40 -0400 (0:00:00.033) 0:00:27.435 ***** 12180 1727204080.02348: entering _queue_task() for managed-node1/assert 12180 1727204080.02569: worker is 1 (out of 1 available) 12180 1727204080.02582: exiting _queue_task() for managed-node1/assert 12180 1727204080.02595: done queuing things up, now waiting for results queue to drain 12180 1727204080.02596: waiting for pending results... 
12180 1727204080.02781: running TaskExecutor() for managed-node1/TASK: Assert that the profile is present - 'bond0.1' 12180 1727204080.02859: in run() - task 0affcd87-79f5-ccb1-55ae-00000000026f 12180 1727204080.02872: variable 'ansible_search_path' from source: unknown 12180 1727204080.02875: variable 'ansible_search_path' from source: unknown 12180 1727204080.02909: calling self._execute() 12180 1727204080.02982: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204080.02986: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204080.02998: variable 'omit' from source: magic vars 12180 1727204080.03272: variable 'ansible_distribution_major_version' from source: facts 12180 1727204080.03282: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204080.03288: variable 'omit' from source: magic vars 12180 1727204080.03314: variable 'omit' from source: magic vars 12180 1727204080.03389: variable 'profile' from source: include params 12180 1727204080.03396: variable 'item' from source: include params 12180 1727204080.03794: variable 'item' from source: include params 12180 1727204080.03797: variable 'omit' from source: magic vars 12180 1727204080.03800: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12180 1727204080.03803: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12180 1727204080.03806: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12180 1727204080.03808: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204080.03811: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204080.03813: variable 'inventory_hostname' from source: host vars for 
'managed-node1'
12180 1727204080.03816: variable 'ansible_host' from source: host vars for 'managed-node1'
12180 1727204080.03818: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12180 1727204080.03820: Set connection var ansible_pipelining to False
12180 1727204080.03823: Set connection var ansible_shell_type to sh
12180 1727204080.03825: Set connection var ansible_timeout to 10
12180 1727204080.03827: Set connection var ansible_connection to ssh
12180 1727204080.03832: Set connection var ansible_shell_executable to /bin/sh
12180 1727204080.03835: Set connection var ansible_module_compression to ZIP_DEFLATED
12180 1727204080.03837: variable 'ansible_shell_executable' from source: unknown
12180 1727204080.03839: variable 'ansible_connection' from source: unknown
12180 1727204080.03841: variable 'ansible_module_compression' from source: unknown
12180 1727204080.03844: variable 'ansible_shell_type' from source: unknown
12180 1727204080.03846: variable 'ansible_shell_executable' from source: unknown
12180 1727204080.03848: variable 'ansible_host' from source: host vars for 'managed-node1'
12180 1727204080.03851: variable 'ansible_pipelining' from source: unknown
12180 1727204080.03853: variable 'ansible_timeout' from source: unknown
12180 1727204080.03855: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12180 1727204080.03992: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
12180 1727204080.03996: variable 'omit' from source: magic vars
12180 1727204080.03999: starting attempt loop
12180 1727204080.04001: running the handler
12180 1727204080.04051: variable 'lsr_net_profile_exists' from source: set_fact
12180 1727204080.04055: Evaluated conditional (lsr_net_profile_exists): True
12180 1727204080.04063: handler run complete
12180 1727204080.04078: attempt loop complete, returning result
12180 1727204080.04081: _execute() done
12180 1727204080.04084: dumping result to json
12180 1727204080.04086: done dumping result, returning
12180 1727204080.04095: done running TaskExecutor() for managed-node1/TASK: Assert that the profile is present - 'bond0.1' [0affcd87-79f5-ccb1-55ae-00000000026f]
12180 1727204080.04097: sending task result for task 0affcd87-79f5-ccb1-55ae-00000000026f
12180 1727204080.04188: done sending task result for task 0affcd87-79f5-ccb1-55ae-00000000026f
12180 1727204080.04191: WORKER PROCESS EXITING
ok: [managed-node1] => {
    "changed": false
}

MSG:

All assertions passed
12180 1727204080.04430: no more pending results, returning what we have
12180 1727204080.04434: results queue empty
12180 1727204080.04435: checking for any_errors_fatal
12180 1727204080.04440: done checking for any_errors_fatal
12180 1727204080.04440: checking for max_fail_percentage
12180 1727204080.04442: done checking for max_fail_percentage
12180 1727204080.04443: checking to see if all hosts have failed and the running result is not ok
12180 1727204080.04444: done checking to see if all hosts have failed
12180 1727204080.04445: getting the remaining hosts for this loop
12180 1727204080.04446: done getting the remaining hosts for this loop
12180 1727204080.04449: getting the next task for host managed-node1
12180 1727204080.04455: done getting next task for host managed-node1
12180 1727204080.04457: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}'
12180 1727204080.04460: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12180 1727204080.04466: getting variables
12180 1727204080.04467: in VariableManager get_vars()
12180 1727204080.04505: Calling all_inventory to load vars for managed-node1
12180 1727204080.04508: Calling groups_inventory to load vars for managed-node1
12180 1727204080.04511: Calling all_plugins_inventory to load vars for managed-node1
12180 1727204080.04521: Calling all_plugins_play to load vars for managed-node1
12180 1727204080.04523: Calling groups_plugins_inventory to load vars for managed-node1
12180 1727204080.04528: Calling groups_plugins_play to load vars for managed-node1
12180 1727204080.05938: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12180 1727204080.06858: done with get_vars()
12180 1727204080.06879: done getting variables
12180 1727204080.06923: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
12180 1727204080.07012: variable 'profile' from source: include params
12180 1727204080.07015: variable 'item' from source: include params
12180 1727204080.07080: variable 'item' from source: include params

TASK [Assert that the ansible managed comment is present in 'bond0.1'] *********
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10
Tuesday 24 September 2024 14:54:40 -0400 (0:00:00.047) 0:00:27.483 *****
12180 1727204080.07115: entering _queue_task() for managed-node1/assert
12180 1727204080.07491: worker is 1 (out of 1 available)
12180 1727204080.07504: exiting _queue_task() for managed-node1/assert
12180 1727204080.07516: done queuing things up, now waiting for results queue to drain
12180 1727204080.07518: waiting for pending results...
12180 1727204080.07989: running TaskExecutor() for managed-node1/TASK: Assert that the ansible managed comment is present in 'bond0.1'
12180 1727204080.07995: in run() - task 0affcd87-79f5-ccb1-55ae-000000000270
12180 1727204080.07999: variable 'ansible_search_path' from source: unknown
12180 1727204080.08002: variable 'ansible_search_path' from source: unknown
12180 1727204080.08004: calling self._execute()
12180 1727204080.08052: variable 'ansible_host' from source: host vars for 'managed-node1'
12180 1727204080.08057: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12180 1727204080.08072: variable 'omit' from source: magic vars
12180 1727204080.08445: variable 'ansible_distribution_major_version' from source: facts
12180 1727204080.08457: Evaluated conditional (ansible_distribution_major_version != '6'): True
12180 1727204080.08465: variable 'omit' from source: magic vars
12180 1727204080.08509: variable 'omit' from source: magic vars
12180 1727204080.08615: variable 'profile' from source: include params
12180 1727204080.08626: variable 'item' from source: include params
12180 1727204080.08690: variable 'item' from source: include params
12180 1727204080.08709: variable 'omit' from source: magic vars
12180 1727204080.08756: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
12180 1727204080.08793: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
12180 1727204080.08819: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
12180 1727204080.08842: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
12180 1727204080.08855: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
12180 1727204080.08889: variable 'inventory_hostname' from source: host vars for 'managed-node1'
12180 1727204080.08892: variable 'ansible_host' from source: host vars for 'managed-node1'
12180 1727204080.08895: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12180 1727204080.08999: Set connection var ansible_pipelining to False
12180 1727204080.09002: Set connection var ansible_shell_type to sh
12180 1727204080.09008: Set connection var ansible_timeout to 10
12180 1727204080.09014: Set connection var ansible_connection to ssh
12180 1727204080.09020: Set connection var ansible_shell_executable to /bin/sh
12180 1727204080.09025: Set connection var ansible_module_compression to ZIP_DEFLATED
12180 1727204080.09057: variable 'ansible_shell_executable' from source: unknown
12180 1727204080.09062: variable 'ansible_connection' from source: unknown
12180 1727204080.09065: variable 'ansible_module_compression' from source: unknown
12180 1727204080.09067: variable 'ansible_shell_type' from source: unknown
12180 1727204080.09070: variable 'ansible_shell_executable' from source: unknown
12180 1727204080.09082: variable 'ansible_host' from source: host vars for 'managed-node1'
12180 1727204080.09084: variable 'ansible_pipelining' from source: unknown
12180 1727204080.09087: variable 'ansible_timeout' from source: unknown
12180 1727204080.09089: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12180 1727204080.09221: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
12180 1727204080.09234: variable 'omit' from source: magic vars
12180 1727204080.09238: starting attempt loop
12180 1727204080.09241: running the handler
12180 1727204080.09353: variable 'lsr_net_profile_ansible_managed' from source: set_fact
12180 1727204080.09357: Evaluated conditional (lsr_net_profile_ansible_managed): True
12180 1727204080.09367: handler run complete
12180 1727204080.09385: attempt loop complete, returning result
12180 1727204080.09388: _execute() done
12180 1727204080.09390: dumping result to json
12180 1727204080.09393: done dumping result, returning
12180 1727204080.09402: done running TaskExecutor() for managed-node1/TASK: Assert that the ansible managed comment is present in 'bond0.1' [0affcd87-79f5-ccb1-55ae-000000000270]
12180 1727204080.09409: sending task result for task 0affcd87-79f5-ccb1-55ae-000000000270
12180 1727204080.09503: done sending task result for task 0affcd87-79f5-ccb1-55ae-000000000270
12180 1727204080.09507: WORKER PROCESS EXITING
ok: [managed-node1] => {
    "changed": false
}

MSG:

All assertions passed
12180 1727204080.09558: no more pending results, returning what we have
12180 1727204080.09562: results queue empty
12180 1727204080.09565: checking for any_errors_fatal
12180 1727204080.09571: done checking for any_errors_fatal
12180 1727204080.09572: checking for max_fail_percentage
12180 1727204080.09574: done checking for max_fail_percentage
12180 1727204080.09575: checking to see if all hosts have failed and the running result is not ok
12180 1727204080.09576: done checking to see if all hosts have failed
12180 1727204080.09577: getting the remaining hosts for this loop
12180 1727204080.09579: done getting the remaining hosts for this loop
12180 1727204080.09583: getting the next task for host managed-node1
12180 1727204080.09590: done getting next task for host managed-node1
12180 1727204080.09592: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }}
12180 1727204080.09595: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12180 1727204080.09600: getting variables
12180 1727204080.09602: in VariableManager get_vars()
12180 1727204080.09647: Calling all_inventory to load vars for managed-node1
12180 1727204080.09651: Calling groups_inventory to load vars for managed-node1
12180 1727204080.09654: Calling all_plugins_inventory to load vars for managed-node1
12180 1727204080.09668: Calling all_plugins_play to load vars for managed-node1
12180 1727204080.09671: Calling groups_plugins_inventory to load vars for managed-node1
12180 1727204080.09674: Calling groups_plugins_play to load vars for managed-node1
12180 1727204080.11515: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12180 1727204080.14398: done with get_vars()
12180 1727204080.14424: done getting variables
12180 1727204080.15604: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
12180 1727204080.15733: variable 'profile' from source: include params
12180 1727204080.15738: variable 'item' from source: include params
12180 1727204080.15801: variable 'item' from source: include params

TASK [Assert that the fingerprint comment is present in bond0.1] ***************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15
Tuesday 24 September 2024 14:54:40 -0400 (0:00:00.087) 0:00:27.570 *****
12180 1727204080.15840: entering _queue_task() for managed-node1/assert
12180 1727204080.16181: worker is 1 (out of 1 available)
12180 1727204080.16194: exiting _queue_task() for managed-node1/assert
12180 1727204080.16206: done queuing things up, now waiting for results queue to drain
12180 1727204080.16208: waiting for pending results...
12180 1727204080.17393: running TaskExecutor() for managed-node1/TASK: Assert that the fingerprint comment is present in bond0.1
12180 1727204080.17532: in run() - task 0affcd87-79f5-ccb1-55ae-000000000271
12180 1727204080.17559: variable 'ansible_search_path' from source: unknown
12180 1727204080.17567: variable 'ansible_search_path' from source: unknown
12180 1727204080.17611: calling self._execute()
12180 1727204080.17721: variable 'ansible_host' from source: host vars for 'managed-node1'
12180 1727204080.17736: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12180 1727204080.17752: variable 'omit' from source: magic vars
12180 1727204080.18121: variable 'ansible_distribution_major_version' from source: facts
12180 1727204080.18145: Evaluated conditional (ansible_distribution_major_version != '6'): True
12180 1727204080.18157: variable 'omit' from source: magic vars
12180 1727204080.18212: variable 'omit' from source: magic vars
12180 1727204080.18327: variable 'profile' from source: include params
12180 1727204080.18341: variable 'item' from source: include params
12180 1727204080.18413: variable 'item' from source: include params
12180 1727204080.18446: variable 'omit' from source: magic vars
12180 1727204080.18497: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
12180 1727204080.18547: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
12180 1727204080.18579: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
12180 1727204080.18602: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
12180 1727204080.18618: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
12180 1727204080.18660: variable 'inventory_hostname' from source: host vars for 'managed-node1'
12180 1727204080.18673: variable 'ansible_host' from source: host vars for 'managed-node1'
12180 1727204080.18681: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12180 1727204080.18797: Set connection var ansible_pipelining to False
12180 1727204080.18806: Set connection var ansible_shell_type to sh
12180 1727204080.18819: Set connection var ansible_timeout to 10
12180 1727204080.18828: Set connection var ansible_connection to ssh
12180 1727204080.18842: Set connection var ansible_shell_executable to /bin/sh
12180 1727204080.18853: Set connection var ansible_module_compression to ZIP_DEFLATED
12180 1727204080.18888: variable 'ansible_shell_executable' from source: unknown
12180 1727204080.18896: variable 'ansible_connection' from source: unknown
12180 1727204080.18903: variable 'ansible_module_compression' from source: unknown
12180 1727204080.18910: variable 'ansible_shell_type' from source: unknown
12180 1727204080.18917: variable 'ansible_shell_executable' from source: unknown
12180 1727204080.18923: variable 'ansible_host' from source: host vars for 'managed-node1'
12180 1727204080.18936: variable 'ansible_pipelining' from source: unknown
12180 1727204080.18948: variable 'ansible_timeout' from source: unknown
12180 1727204080.18986: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12180 1727204080.19332: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
12180 1727204080.19351: variable 'omit' from source: magic vars
12180 1727204080.19362: starting attempt loop
12180 1727204080.19373: running the handler
12180 1727204080.19501: variable 'lsr_net_profile_fingerprint' from source: set_fact
12180 1727204080.19515: Evaluated conditional (lsr_net_profile_fingerprint): True
12180 1727204080.19528: handler run complete
12180 1727204080.19550: attempt loop complete, returning result
12180 1727204080.19558: _execute() done
12180 1727204080.19566: dumping result to json
12180 1727204080.19577: done dumping result, returning
12180 1727204080.19591: done running TaskExecutor() for managed-node1/TASK: Assert that the fingerprint comment is present in bond0.1 [0affcd87-79f5-ccb1-55ae-000000000271]
12180 1727204080.19600: sending task result for task 0affcd87-79f5-ccb1-55ae-000000000271
ok: [managed-node1] => {
    "changed": false
}

MSG:

All assertions passed
12180 1727204080.19778: no more pending results, returning what we have
12180 1727204080.19783: results queue empty
12180 1727204080.19784: checking for any_errors_fatal
12180 1727204080.19792: done checking for any_errors_fatal
12180 1727204080.19793: checking for max_fail_percentage
12180 1727204080.19795: done checking for max_fail_percentage
12180 1727204080.19796: checking to see if all hosts have failed and the running result is not ok
12180 1727204080.19797: done checking to see if all hosts have failed
12180 1727204080.19798: getting the remaining hosts for this loop
12180 1727204080.19800: done getting the remaining hosts for this loop
12180 1727204080.19804: getting the next task for host managed-node1
12180 1727204080.19814: done getting next task for host managed-node1
12180 1727204080.19818: ^ task is: TASK: ** TEST check polling interval
12180 1727204080.19820: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12180 1727204080.19825: getting variables
12180 1727204080.19827: in VariableManager get_vars()
12180 1727204080.19879: Calling all_inventory to load vars for managed-node1
12180 1727204080.19883: Calling groups_inventory to load vars for managed-node1
12180 1727204080.19886: Calling all_plugins_inventory to load vars for managed-node1
12180 1727204080.19899: Calling all_plugins_play to load vars for managed-node1
12180 1727204080.19903: Calling groups_plugins_inventory to load vars for managed-node1
12180 1727204080.19906: Calling groups_plugins_play to load vars for managed-node1
12180 1727204080.20938: done sending task result for task 0affcd87-79f5-ccb1-55ae-000000000271
12180 1727204080.20941: WORKER PROCESS EXITING
12180 1727204080.21846: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12180 1727204080.23644: done with get_vars()
12180 1727204080.23677: done getting variables
12180 1727204080.23744: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [** TEST check polling interval] ******************************************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml:75
Tuesday 24 September 2024 14:54:40 -0400 (0:00:00.079) 0:00:27.649 *****
12180 1727204080.23779: entering _queue_task() for managed-node1/command
12180 1727204080.24116: worker is 1 (out of 1 available)
12180 1727204080.24128: exiting _queue_task() for managed-node1/command
12180 1727204080.24139: done queuing things up, now waiting for results queue to drain
12180 1727204080.24141: waiting for pending results...
12180 1727204080.24442: running TaskExecutor() for managed-node1/TASK: ** TEST check polling interval
12180 1727204080.24549: in run() - task 0affcd87-79f5-ccb1-55ae-000000000071
12180 1727204080.24572: variable 'ansible_search_path' from source: unknown
12180 1727204080.24623: calling self._execute()
12180 1727204080.24739: variable 'ansible_host' from source: host vars for 'managed-node1'
12180 1727204080.24751: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12180 1727204080.24770: variable 'omit' from source: magic vars
12180 1727204080.25177: variable 'ansible_distribution_major_version' from source: facts
12180 1727204080.25196: Evaluated conditional (ansible_distribution_major_version != '6'): True
12180 1727204080.25209: variable 'omit' from source: magic vars
12180 1727204080.25240: variable 'omit' from source: magic vars
12180 1727204080.25354: variable 'controller_device' from source: play vars
12180 1727204080.25384: variable 'omit' from source: magic vars
12180 1727204080.25435: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
12180 1727204080.25487: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
12180 1727204080.25514: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
12180 1727204080.25540: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
12180 1727204080.25561: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
12180 1727204080.25604: variable 'inventory_hostname' from source: host vars for 'managed-node1'
12180 1727204080.25616: variable 'ansible_host' from source: host vars for 'managed-node1'
12180 1727204080.25624: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12180 1727204080.25741: Set connection var ansible_pipelining to False
12180 1727204080.25750: Set connection var ansible_shell_type to sh
12180 1727204080.25762: Set connection var ansible_timeout to 10
12180 1727204080.25778: Set connection var ansible_connection to ssh
12180 1727204080.25789: Set connection var ansible_shell_executable to /bin/sh
12180 1727204080.25801: Set connection var ansible_module_compression to ZIP_DEFLATED
12180 1727204080.25836: variable 'ansible_shell_executable' from source: unknown
12180 1727204080.25844: variable 'ansible_connection' from source: unknown
12180 1727204080.25852: variable 'ansible_module_compression' from source: unknown
12180 1727204080.25858: variable 'ansible_shell_type' from source: unknown
12180 1727204080.25867: variable 'ansible_shell_executable' from source: unknown
12180 1727204080.25876: variable 'ansible_host' from source: host vars for 'managed-node1'
12180 1727204080.25886: variable 'ansible_pipelining' from source: unknown
12180 1727204080.25892: variable 'ansible_timeout' from source: unknown
12180 1727204080.25900: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12180 1727204080.26060: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
12180 1727204080.26081: variable 'omit' from source: magic vars
12180 1727204080.26090: starting attempt loop
12180 1727204080.26101: running the handler
12180 1727204080.26121: _low_level_execute_command(): starting
12180 1727204080.26139: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
12180 1727204080.26973: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
12180 1727204080.26991: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
12180 1727204080.27008: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
12180 1727204080.27033: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
12180 1727204080.27089: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<<
12180 1727204080.27103: stderr chunk (state=3): >>>debug2: match not found <<<
12180 1727204080.27118: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
12180 1727204080.27142: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
12180 1727204080.27156: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<<
12180 1727204080.27170: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
12180 1727204080.27184: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
12180 1727204080.27202: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
12180 1727204080.27221: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
12180 1727204080.27236: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<<
12180 1727204080.27253: stderr chunk (state=3): >>>debug2: match found <<<
12180 1727204080.27272: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
12180 1727204080.27351: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
12180 1727204080.27382: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
12180 1727204080.27400: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
12180 1727204080.27539: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
12180 1727204080.29150: stdout chunk (state=3): >>>/root <<<
12180 1727204080.29360: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
12180 1727204080.29366: stdout chunk (state=3): >>><<<
12180 1727204080.29369: stderr chunk (state=3): >>><<<
12180 1727204080.29494: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
12180 1727204080.29499: _low_level_execute_command(): starting
12180 1727204080.29501: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204080.2939315-14946-12186202079332 `" && echo ansible-tmp-1727204080.2939315-14946-12186202079332="` echo /root/.ansible/tmp/ansible-tmp-1727204080.2939315-14946-12186202079332 `" ) && sleep 0'
12180 1727204080.31741: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
12180 1727204080.31756: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
12180 1727204080.31773: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
12180 1727204080.31797: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
12180 1727204080.31841: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<<
12180 1727204080.31902: stderr chunk (state=3): >>>debug2: match not found <<<
12180 1727204080.31916: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
12180 1727204080.31933: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
12180 1727204080.31945: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<<
12180 1727204080.31955: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
12180 1727204080.31967: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
12180 1727204080.31980: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
12180 1727204080.31996: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
12180 1727204080.32012: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<<
12180 1727204080.32022: stderr chunk (state=3): >>>debug2: match found <<<
12180 1727204080.32034: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
12180 1727204080.32119: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
12180 1727204080.32241: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
12180 1727204080.32255: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
12180 1727204080.32455: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
12180 1727204080.34317: stdout chunk (state=3): >>>ansible-tmp-1727204080.2939315-14946-12186202079332=/root/.ansible/tmp/ansible-tmp-1727204080.2939315-14946-12186202079332 <<<
12180 1727204080.34521: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
12180 1727204080.34528: stdout chunk (state=3): >>><<<
12180 1727204080.34530: stderr chunk (state=3): >>><<<
12180 1727204080.34773: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204080.2939315-14946-12186202079332=/root/.ansible/tmp/ansible-tmp-1727204080.2939315-14946-12186202079332 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
12180 1727204080.34777: variable 'ansible_module_compression' from source: unknown
12180 1727204080.34779: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12180cbnqllfr/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED
12180 1727204080.34781: variable 'ansible_facts' from source: unknown
12180 1727204080.34784: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204080.2939315-14946-12186202079332/AnsiballZ_command.py
12180 1727204080.36337: Sending initial data
12180 1727204080.36342: Sent initial data (155 bytes)
12180 1727204080.38786: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
12180 1727204080.38846: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
12180 1727204080.38866: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
12180 1727204080.38888: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
12180 1727204080.38977: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<<
12180 1727204080.38989: stderr chunk (state=3): >>>debug2: match not found <<<
12180 1727204080.39002: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
12180 1727204080.39023: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
12180 1727204080.39056: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<<
12180 1727204080.39070: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
12180 1727204080.39082: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
12180 1727204080.39095: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
12180 1727204080.39110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
12180 1727204080.39128: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<<
12180 1727204080.39167: stderr chunk (state=3): >>>debug2: match found <<<
12180 1727204080.39184: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
12180 1727204080.39302: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
12180 1727204080.39349: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
12180 1727204080.39367: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
12180 1727204080.39562: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
12180 1727204080.41231: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<<
12180 1727204080.41276: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<<
12180 1727204080.41332: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12180cbnqllfr/tmp0k7ugycp /root/.ansible/tmp/ansible-tmp-1727204080.2939315-14946-12186202079332/AnsiballZ_command.py <<<
12180 1727204080.41384: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<<
12180 1727204080.42785: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
12180 1727204080.42954: stderr chunk (state=3): >>><<<
12180 1727204080.42957: stdout chunk (state=3): >>><<<
12180 1727204080.42958: done transferring module to remote
12180 1727204080.42967: _low_level_execute_command(): starting
12180 1727204080.42969: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204080.2939315-14946-12186202079332/ /root/.ansible/tmp/ansible-tmp-1727204080.2939315-14946-12186202079332/AnsiballZ_command.py && sleep 0'
12180 1727204080.44357: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
12180 1727204080.44513: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
12180 1727204080.44528: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
12180 1727204080.44545: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
12180 1727204080.44592: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<<
12180 1727204080.44608: stderr chunk (state=3): >>>debug2: match not found <<<
12180 1727204080.44622: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
12180 1727204080.44639: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
12180 1727204080.44650: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<<
12180 1727204080.44660: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
12180 1727204080.44673: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
12180 1727204080.44685: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
12180 1727204080.44699: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
12180 1727204080.44711: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<<
12180 1727204080.44727: stderr chunk (state=3): >>>debug2: match found <<<
12180 1727204080.44740: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
12180 1727204080.44819: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
12180 1727204080.44951: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
12180 1727204080.44968: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
12180 1727204080.45168: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
12180 1727204080.46885: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
12180 1727204080.46946: stderr chunk (state=3): >>><<<
12180 1727204080.46949: stdout chunk (state=3): >>><<<
12180 1727204080.46970: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname
10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204080.47060: _low_level_execute_command(): starting 12180 1727204080.47066: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204080.2939315-14946-12186202079332/AnsiballZ_command.py && sleep 0' 12180 1727204080.48621: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204080.48638: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204080.48655: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204080.48678: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204080.48734: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204080.48807: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204080.48825: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204080.48845: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204080.48858: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204080.48875: stderr chunk (state=3): >>>debug1: re-parsing configuration 
<<< 12180 1727204080.48889: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204080.48908: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204080.48929: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204080.48942: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204080.49028: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204080.49041: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204080.49119: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204080.49147: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204080.49160: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204080.49355: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204080.62715: stdout chunk (state=3): >>> {"changed": true, "stdout": "MII Polling Interval (ms): 110", "stderr": "", "rc": 0, "cmd": ["grep", "Polling Interval", "/proc/net/bonding/deprecated-bond"], "start": "2024-09-24 14:54:40.623151", "end": "2024-09-24 14:54:40.626341", "delta": "0:00:00.003190", "msg": "", "invocation": {"module_args": {"_raw_params": "grep 'Polling Interval' /proc/net/bonding/deprecated-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12180 1727204080.63899: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
<<< 12180 1727204080.63903: stdout chunk (state=3): >>><<< 12180 1727204080.63905: stderr chunk (state=3): >>><<< 12180 1727204080.64043: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "MII Polling Interval (ms): 110", "stderr": "", "rc": 0, "cmd": ["grep", "Polling Interval", "/proc/net/bonding/deprecated-bond"], "start": "2024-09-24 14:54:40.623151", "end": "2024-09-24 14:54:40.626341", "delta": "0:00:00.003190", "msg": "", "invocation": {"module_args": {"_raw_params": "grep 'Polling Interval' /proc/net/bonding/deprecated-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
12180 1727204080.64053: done with _execute_module (ansible.legacy.command, {'_raw_params': "grep 'Polling Interval' /proc/net/bonding/deprecated-bond", '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204080.2939315-14946-12186202079332/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12180 1727204080.64056: _low_level_execute_command(): starting 12180 1727204080.64059: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204080.2939315-14946-12186202079332/ > /dev/null 2>&1 && sleep 0' 12180 1727204080.64631: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204080.65084: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204080.65101: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204080.65119: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204080.65161: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204080.65177: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204080.65193: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204080.65212: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 
1727204080.65223: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204080.65234: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204080.65246: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204080.65259: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204080.65279: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204080.65292: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204080.65303: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204080.65317: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204080.65581: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204080.65598: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204080.65613: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204080.65750: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204080.67478: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204080.67549: stderr chunk (state=3): >>><<< 12180 1727204080.67552: stdout chunk (state=3): >>><<< 12180 1727204080.67678: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204080.67682: handler run complete 12180 1727204080.67684: Evaluated conditional (False): False 12180 1727204080.67813: variable 'result' from source: unknown 12180 1727204080.67834: Evaluated conditional ('110' in result.stdout): True 12180 1727204080.67851: attempt loop complete, returning result 12180 1727204080.67859: _execute() done 12180 1727204080.67868: dumping result to json 12180 1727204080.67879: done dumping result, returning 12180 1727204080.67900: done running TaskExecutor() for managed-node1/TASK: ** TEST check polling interval [0affcd87-79f5-ccb1-55ae-000000000071] 12180 1727204080.67911: sending task result for task 0affcd87-79f5-ccb1-55ae-000000000071 ok: [managed-node1] => { "attempts": 1, "changed": false, "cmd": [ "grep", "Polling Interval", "/proc/net/bonding/deprecated-bond" ], "delta": "0:00:00.003190", "end": "2024-09-24 14:54:40.626341", "rc": 0, "start": "2024-09-24 14:54:40.623151" } STDOUT: MII Polling Interval (ms): 110 12180 1727204080.68114: no more pending results, returning what we have 12180 1727204080.68118: results queue empty 12180 1727204080.68120: checking for any_errors_fatal 12180 1727204080.68126: done checking for any_errors_fatal 12180 1727204080.68127: checking for max_fail_percentage 12180 
1727204080.68130: done checking for max_fail_percentage 12180 1727204080.68130: checking to see if all hosts have failed and the running result is not ok 12180 1727204080.68131: done checking to see if all hosts have failed 12180 1727204080.68132: getting the remaining hosts for this loop 12180 1727204080.68133: done getting the remaining hosts for this loop 12180 1727204080.68139: getting the next task for host managed-node1 12180 1727204080.68146: done getting next task for host managed-node1 12180 1727204080.68149: ^ task is: TASK: ** TEST check IPv4 12180 1727204080.68151: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12180 1727204080.68155: getting variables 12180 1727204080.68157: in VariableManager get_vars() 12180 1727204080.68203: Calling all_inventory to load vars for managed-node1 12180 1727204080.68206: Calling groups_inventory to load vars for managed-node1 12180 1727204080.68209: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204080.68221: Calling all_plugins_play to load vars for managed-node1 12180 1727204080.68224: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204080.68227: Calling groups_plugins_play to load vars for managed-node1 12180 1727204080.69036: done sending task result for task 0affcd87-79f5-ccb1-55ae-000000000071 12180 1727204080.69039: WORKER PROCESS EXITING 12180 1727204080.70849: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204080.72886: done with get_vars() 12180 1727204080.72915: done getting variables 12180 1727204080.72992: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [** TEST check IPv4] ****************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml:80 Tuesday 24 September 2024 14:54:40 -0400 (0:00:00.492) 0:00:28.142 ***** 12180 1727204080.73029: entering _queue_task() for managed-node1/command 12180 1727204080.73446: worker is 1 (out of 1 available) 12180 1727204080.73459: exiting _queue_task() for managed-node1/command 12180 1727204080.73473: done queuing things up, now waiting for results queue to drain 12180 1727204080.73474: waiting for pending results... 12180 1727204080.73804: running TaskExecutor() for managed-node1/TASK: ** TEST check IPv4 12180 1727204080.73918: in run() - task 0affcd87-79f5-ccb1-55ae-000000000072 12180 1727204080.73952: variable 'ansible_search_path' from source: unknown 12180 1727204080.73996: calling self._execute() 12180 1727204080.74112: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204080.74124: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204080.74139: variable 'omit' from source: magic vars 12180 1727204080.74555: variable 'ansible_distribution_major_version' from source: facts 12180 1727204080.74583: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204080.74600: variable 'omit' from source: magic vars 12180 1727204080.74629: variable 'omit' from source: magic vars 12180 1727204080.74740: variable 'controller_device' from source: play vars 12180 1727204080.74765: variable 'omit' from source: magic vars 12180 1727204080.74830: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12180 1727204080.74875: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12180 1727204080.74901: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12180 1727204080.74933: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204080.74949: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204080.74990: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12180 1727204080.75000: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204080.75007: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204080.75125: Set connection var ansible_pipelining to False 12180 1727204080.75138: Set connection var ansible_shell_type to sh 12180 1727204080.75154: Set connection var ansible_timeout to 10 12180 1727204080.75163: Set connection var ansible_connection to ssh 12180 1727204080.75174: Set connection var ansible_shell_executable to /bin/sh 12180 1727204080.75187: Set connection var ansible_module_compression to ZIP_DEFLATED 12180 1727204080.75223: variable 'ansible_shell_executable' from source: unknown 12180 1727204080.75235: variable 'ansible_connection' from source: unknown 12180 1727204080.75247: variable 'ansible_module_compression' from source: unknown 12180 1727204080.75257: variable 'ansible_shell_type' from source: unknown 12180 1727204080.75265: variable 'ansible_shell_executable' from source: unknown 12180 1727204080.75272: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204080.75279: variable 'ansible_pipelining' from source: unknown 12180 1727204080.75284: variable 'ansible_timeout' from source: unknown 12180 1727204080.75294: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 
1727204080.75458: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12180 1727204080.75484: variable 'omit' from source: magic vars 12180 1727204080.75493: starting attempt loop 12180 1727204080.75499: running the handler 12180 1727204080.75517: _low_level_execute_command(): starting 12180 1727204080.75533: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12180 1727204080.76641: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204080.76656: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204080.76673: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204080.76693: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204080.76746: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204080.76757: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204080.76775: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204080.76792: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204080.76804: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204080.76826: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204080.76840: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204080.76854: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204080.76877: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204080.76890: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204080.76917: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204080.76942: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204080.77024: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204080.77051: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204080.77069: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204080.77237: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204080.78796: stdout chunk (state=3): >>>/root <<< 12180 1727204080.78998: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204080.79001: stdout chunk (state=3): >>><<< 12180 1727204080.79004: stderr chunk (state=3): >>><<< 12180 1727204080.79128: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204080.79132: _low_level_execute_command(): starting 12180 1727204080.79136: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204080.7903156-14968-222385648457550 `" && echo ansible-tmp-1727204080.7903156-14968-222385648457550="` echo /root/.ansible/tmp/ansible-tmp-1727204080.7903156-14968-222385648457550 `" ) && sleep 0' 12180 1727204080.80052: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204080.80070: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204080.80095: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204080.80114: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204080.80158: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204080.80174: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204080.80197: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204080.80215: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204080.80228: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204080.80241: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204080.80253: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204080.80270: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204080.80287: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204080.80308: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204080.80320: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204080.80335: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204080.80420: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204080.80444: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204080.80462: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204080.80561: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204080.82406: stdout chunk (state=3): >>>ansible-tmp-1727204080.7903156-14968-222385648457550=/root/.ansible/tmp/ansible-tmp-1727204080.7903156-14968-222385648457550 <<< 12180 1727204080.82584: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204080.82631: stderr chunk (state=3): >>><<< 12180 1727204080.82634: stdout chunk (state=3): >>><<< 12180 1727204080.82875: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204080.7903156-14968-222385648457550=/root/.ansible/tmp/ansible-tmp-1727204080.7903156-14968-222385648457550 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204080.82878: variable 'ansible_module_compression' from source: unknown 12180 1727204080.82880: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12180cbnqllfr/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12180 1727204080.82883: variable 'ansible_facts' from source: unknown 12180 1727204080.82885: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204080.7903156-14968-222385648457550/AnsiballZ_command.py 12180 1727204080.83022: Sending initial data 12180 1727204080.83026: Sent initial data (156 bytes) 12180 1727204080.84057: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204080.84082: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204080.84098: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204080.84117: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204080.84161: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204080.84178: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204080.84201: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204080.84220: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204080.84233: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204080.84245: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204080.84258: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204080.84275: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204080.84298: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204080.84312: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204080.84323: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204080.84337: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204080.84422: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204080.84446: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204080.84466: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204080.84559: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204080.86278: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 
1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12180 1727204080.86327: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12180 1727204080.86384: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12180cbnqllfr/tmpu4w6o8cc /root/.ansible/tmp/ansible-tmp-1727204080.7903156-14968-222385648457550/AnsiballZ_command.py <<< 12180 1727204080.86435: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12180 1727204080.87540: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204080.87712: stderr chunk (state=3): >>><<< 12180 1727204080.87716: stdout chunk (state=3): >>><<< 12180 1727204080.87744: done transferring module to remote 12180 1727204080.87755: _low_level_execute_command(): starting 12180 1727204080.87760: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204080.7903156-14968-222385648457550/ /root/.ansible/tmp/ansible-tmp-1727204080.7903156-14968-222385648457550/AnsiballZ_command.py && sleep 0' 12180 1727204080.88418: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204080.88428: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204080.88440: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204080.88454: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204080.88494: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204080.88501: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204080.88510: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204080.88523: 
stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204080.88531: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204080.88542: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204080.88551: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204080.88557: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204080.88570: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204080.88577: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204080.88583: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204080.88592: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204080.88666: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204080.88684: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204080.88695: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204080.88774: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204080.90485: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204080.90581: stderr chunk (state=3): >>><<< 12180 1727204080.90584: stdout chunk (state=3): >>><<< 12180 1727204080.90608: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204080.90612: _low_level_execute_command(): starting 12180 1727204080.90619: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204080.7903156-14968-222385648457550/AnsiballZ_command.py && sleep 0' 12180 1727204080.91287: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204080.91297: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204080.91307: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204080.91321: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204080.91373: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204080.91380: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204080.91390: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204080.91403: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204080.91412: stderr chunk 
(state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204080.91418: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204080.91426: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204080.91438: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204080.91450: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204080.91467: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204080.91475: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204080.91484: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204080.91558: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204080.91584: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204080.91595: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204080.91683: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204081.05215: stdout chunk (state=3): >>> {"changed": true, "stdout": "13: deprecated-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet 192.0.2.15/24 brd 192.0.2.255 scope global dynamic noprefixroute deprecated-bond\n valid_lft 236sec preferred_lft 236sec", "stderr": "", "rc": 0, "cmd": ["ip", "-4", "a", "s", "deprecated-bond"], "start": "2024-09-24 14:54:41.048219", "end": "2024-09-24 14:54:41.051561", "delta": "0:00:00.003342", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -4 a s deprecated-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, 
"stdin": null}}} <<< 12180 1727204081.06322: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. <<< 12180 1727204081.06380: stderr chunk (state=3): >>><<< 12180 1727204081.06383: stdout chunk (state=3): >>><<< 12180 1727204081.06400: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "13: deprecated-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet 192.0.2.15/24 brd 192.0.2.255 scope global dynamic noprefixroute deprecated-bond\n valid_lft 236sec preferred_lft 236sec", "stderr": "", "rc": 0, "cmd": ["ip", "-4", "a", "s", "deprecated-bond"], "start": "2024-09-24 14:54:41.048219", "end": "2024-09-24 14:54:41.051561", "delta": "0:00:00.003342", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -4 a s deprecated-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 12180 1727204081.06438: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -4 a s deprecated-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204080.7903156-14968-222385648457550/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12180 1727204081.06444: _low_level_execute_command(): starting 12180 1727204081.06449: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204080.7903156-14968-222385648457550/ > /dev/null 2>&1 && sleep 0' 12180 1727204081.06923: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204081.06927: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204081.06968: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204081.06980: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204081.06984: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found <<< 12180 1727204081.06993: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204081.07043: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204081.07055: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204081.07121: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204081.08902: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204081.08959: stderr chunk (state=3): >>><<< 12180 1727204081.08962: stdout chunk (state=3): >>><<< 12180 1727204081.08983: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204081.08988: handler run complete 12180 1727204081.09011: Evaluated conditional (False): False 12180 1727204081.09125: variable 'result' from source: set_fact 12180 1727204081.09139: Evaluated conditional ('192.0.2' in result.stdout): True 12180 1727204081.09149: attempt loop complete, returning result 12180 1727204081.09152: _execute() done 12180 1727204081.09154: dumping result to json 12180 1727204081.09159: done dumping result, returning 12180 1727204081.09168: done running TaskExecutor() for managed-node1/TASK: ** TEST check IPv4 [0affcd87-79f5-ccb1-55ae-000000000072] 12180 1727204081.09173: sending task result for task 0affcd87-79f5-ccb1-55ae-000000000072 12180 1727204081.09279: done sending task result for task 0affcd87-79f5-ccb1-55ae-000000000072 12180 1727204081.09282: WORKER PROCESS EXITING ok: [managed-node1] => { "attempts": 1, "changed": false, "cmd": [ "ip", "-4", "a", "s", "deprecated-bond" ], "delta": "0:00:00.003342", "end": "2024-09-24 14:54:41.051561", "rc": 0, "start": "2024-09-24 14:54:41.048219" } STDOUT: 13: deprecated-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000 inet 192.0.2.15/24 brd 192.0.2.255 scope global dynamic noprefixroute deprecated-bond valid_lft 236sec preferred_lft 236sec 12180 1727204081.09388: no more pending results, returning what we have 12180 1727204081.09393: results queue empty 12180 1727204081.09394: checking for any_errors_fatal 12180 1727204081.09400: done checking for any_errors_fatal 12180 1727204081.09401: checking for max_fail_percentage 12180 1727204081.09402: done checking for max_fail_percentage 12180 1727204081.09403: checking to see if all hosts have failed and the running result is not ok 
12180 1727204081.09404: done checking to see if all hosts have failed 12180 1727204081.09405: getting the remaining hosts for this loop 12180 1727204081.09406: done getting the remaining hosts for this loop 12180 1727204081.09409: getting the next task for host managed-node1 12180 1727204081.09414: done getting next task for host managed-node1 12180 1727204081.09416: ^ task is: TASK: ** TEST check IPv6 12180 1727204081.09418: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12180 1727204081.09421: getting variables 12180 1727204081.09422: in VariableManager get_vars() 12180 1727204081.09462: Calling all_inventory to load vars for managed-node1 12180 1727204081.09466: Calling groups_inventory to load vars for managed-node1 12180 1727204081.09468: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204081.09478: Calling all_plugins_play to load vars for managed-node1 12180 1727204081.09480: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204081.09482: Calling groups_plugins_play to load vars for managed-node1 12180 1727204081.10299: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204081.11227: done with get_vars() 12180 1727204081.11249: done getting variables 12180 1727204081.11294: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [** TEST check IPv6] ****************************************************** task path: 
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml:87 Tuesday 24 September 2024 14:54:41 -0400 (0:00:00.382) 0:00:28.525 ***** 12180 1727204081.11314: entering _queue_task() for managed-node1/command 12180 1727204081.11548: worker is 1 (out of 1 available) 12180 1727204081.11562: exiting _queue_task() for managed-node1/command 12180 1727204081.11575: done queuing things up, now waiting for results queue to drain 12180 1727204081.11577: waiting for pending results... 12180 1727204081.11757: running TaskExecutor() for managed-node1/TASK: ** TEST check IPv6 12180 1727204081.11824: in run() - task 0affcd87-79f5-ccb1-55ae-000000000073 12180 1727204081.11837: variable 'ansible_search_path' from source: unknown 12180 1727204081.11867: calling self._execute() 12180 1727204081.11945: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204081.11949: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204081.11958: variable 'omit' from source: magic vars 12180 1727204081.12236: variable 'ansible_distribution_major_version' from source: facts 12180 1727204081.12245: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204081.12251: variable 'omit' from source: magic vars 12180 1727204081.12268: variable 'omit' from source: magic vars 12180 1727204081.12338: variable 'controller_device' from source: play vars 12180 1727204081.12349: variable 'omit' from source: magic vars 12180 1727204081.12386: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12180 1727204081.12414: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12180 1727204081.12436: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12180 1727204081.12445: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204081.12455: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204081.12482: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12180 1727204081.12485: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204081.12489: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204081.12565: Set connection var ansible_pipelining to False 12180 1727204081.12569: Set connection var ansible_shell_type to sh 12180 1727204081.12575: Set connection var ansible_timeout to 10 12180 1727204081.12578: Set connection var ansible_connection to ssh 12180 1727204081.12584: Set connection var ansible_shell_executable to /bin/sh 12180 1727204081.12589: Set connection var ansible_module_compression to ZIP_DEFLATED 12180 1727204081.12610: variable 'ansible_shell_executable' from source: unknown 12180 1727204081.12613: variable 'ansible_connection' from source: unknown 12180 1727204081.12616: variable 'ansible_module_compression' from source: unknown 12180 1727204081.12618: variable 'ansible_shell_type' from source: unknown 12180 1727204081.12621: variable 'ansible_shell_executable' from source: unknown 12180 1727204081.12623: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204081.12626: variable 'ansible_pipelining' from source: unknown 12180 1727204081.12628: variable 'ansible_timeout' from source: unknown 12180 1727204081.12633: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204081.12737: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12180 1727204081.12748: variable 'omit' from source: magic vars 12180 1727204081.12753: starting attempt loop 12180 1727204081.12757: running the handler 12180 1727204081.12773: _low_level_execute_command(): starting 12180 1727204081.12781: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12180 1727204081.13324: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204081.13359: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204081.13368: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204081.13378: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204081.13383: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204081.13391: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204081.13397: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204081.13403: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204081.13462: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204081.13476: stderr chunk (state=3): >>>debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204081.13542: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204081.15092: stdout chunk (state=3): >>>/root <<< 12180 1727204081.15197: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204081.15264: stderr chunk (state=3): >>><<< 12180 1727204081.15270: stdout chunk (state=3): >>><<< 12180 1727204081.15291: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204081.15302: _low_level_execute_command(): starting 12180 1727204081.15309: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204081.1529088-14982-234614936120412 `" && echo 
ansible-tmp-1727204081.1529088-14982-234614936120412="` echo /root/.ansible/tmp/ansible-tmp-1727204081.1529088-14982-234614936120412 `" ) && sleep 0' 12180 1727204081.15779: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204081.15784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204081.15824: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204081.15838: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204081.15842: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204081.15848: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204081.15855: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204081.15862: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204081.15926: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204081.15934: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204081.16006: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204081.17851: stdout chunk (state=3): 
>>>ansible-tmp-1727204081.1529088-14982-234614936120412=/root/.ansible/tmp/ansible-tmp-1727204081.1529088-14982-234614936120412 <<< 12180 1727204081.17960: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204081.18020: stderr chunk (state=3): >>><<< 12180 1727204081.18024: stdout chunk (state=3): >>><<< 12180 1727204081.18046: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204081.1529088-14982-234614936120412=/root/.ansible/tmp/ansible-tmp-1727204081.1529088-14982-234614936120412 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204081.18075: variable 'ansible_module_compression' from source: unknown 12180 1727204081.18117: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12180cbnqllfr/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12180 1727204081.18150: variable 'ansible_facts' from source: unknown 12180 
1727204081.18217: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204081.1529088-14982-234614936120412/AnsiballZ_command.py 12180 1727204081.18332: Sending initial data 12180 1727204081.18338: Sent initial data (156 bytes) 12180 1727204081.19040: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204081.19053: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204081.19078: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 12180 1727204081.19083: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204081.19092: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204081.19099: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204081.19105: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204081.19114: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204081.19119: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204081.19179: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204081.19185: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204081.19193: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204081.19271: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 12180 1727204081.20959: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 12180 1727204081.20977: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12180 1727204081.21013: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12180 1727204081.21070: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12180cbnqllfr/tmprn507qyp /root/.ansible/tmp/ansible-tmp-1727204081.1529088-14982-234614936120412/AnsiballZ_command.py <<< 12180 1727204081.21117: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12180 1727204081.21957: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204081.22075: stderr chunk (state=3): >>><<< 12180 1727204081.22078: stdout chunk (state=3): >>><<< 12180 1727204081.22096: done transferring module to remote 12180 1727204081.22105: _low_level_execute_command(): starting 12180 1727204081.22111: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204081.1529088-14982-234614936120412/ /root/.ansible/tmp/ansible-tmp-1727204081.1529088-14982-234614936120412/AnsiballZ_command.py && sleep 0' 12180 1727204081.22583: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12180 
1727204081.22586: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204081.22597: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204081.22637: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204081.22640: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204081.22642: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204081.22696: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204081.22700: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204081.22770: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204081.24476: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204081.24532: stderr chunk (state=3): >>><<< 12180 1727204081.24535: stdout chunk (state=3): >>><<< 12180 1727204081.24551: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 
originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204081.24554: _low_level_execute_command(): starting 12180 1727204081.24559: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204081.1529088-14982-234614936120412/AnsiballZ_command.py && sleep 0' 12180 1727204081.25023: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204081.25027: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204081.25062: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204081.25077: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found <<< 12180 1727204081.25091: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204081.25137: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204081.25142: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204081.25148: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204081.25224: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204081.38609: stdout chunk (state=3): >>> {"changed": true, "stdout": "13: deprecated-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet6 2001:db8::1c3/128 scope global dynamic noprefixroute \n valid_lft 237sec preferred_lft 237sec\n inet6 2001:db8::74d7:b813:2697:d255/64 scope global dynamic noprefixroute \n valid_lft 1799sec preferred_lft 1799sec\n inet6 fe80::ee37:167b:41c5:61cc/64 scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "a", "s", "deprecated-bond"], "start": "2024-09-24 14:54:41.382186", "end": "2024-09-24 14:54:41.385523", "delta": "0:00:00.003337", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 a s deprecated-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12180 1727204081.39790: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
<<< 12180 1727204081.39883: stderr chunk (state=3): >>><<< 12180 1727204081.39887: stdout chunk (state=3): >>><<< 12180 1727204081.40047: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "13: deprecated-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet6 2001:db8::1c3/128 scope global dynamic noprefixroute \n valid_lft 237sec preferred_lft 237sec\n inet6 2001:db8::74d7:b813:2697:d255/64 scope global dynamic noprefixroute \n valid_lft 1799sec preferred_lft 1799sec\n inet6 fe80::ee37:167b:41c5:61cc/64 scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "a", "s", "deprecated-bond"], "start": "2024-09-24 14:54:41.382186", "end": "2024-09-24 14:54:41.385523", "delta": "0:00:00.003337", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 a s deprecated-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 12180 1727204081.40051: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -6 a s deprecated-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204081.1529088-14982-234614936120412/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12180 1727204081.40054: _low_level_execute_command(): starting 12180 1727204081.40057: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204081.1529088-14982-234614936120412/ > /dev/null 2>&1 && sleep 0' 12180 1727204081.40727: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204081.40746: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204081.40762: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204081.40785: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204081.40843: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204081.40856: stderr chunk (state=3): 
>>>debug2: match not found <<< 12180 1727204081.40874: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204081.40893: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204081.40906: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204081.40925: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204081.40941: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204081.40957: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204081.40976: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204081.40989: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204081.41001: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204081.41016: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204081.41101: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204081.41125: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204081.41151: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204081.41244: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204081.43025: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204081.43119: stderr chunk (state=3): >>><<< 12180 1727204081.43137: stdout chunk (state=3): >>><<< 12180 1727204081.43370: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204081.43373: handler run complete 12180 1727204081.43376: Evaluated conditional (False): False 12180 1727204081.43378: variable 'result' from source: set_fact 12180 1727204081.43402: Evaluated conditional ('2001' in result.stdout): True 12180 1727204081.43418: attempt loop complete, returning result 12180 1727204081.43425: _execute() done 12180 1727204081.43435: dumping result to json 12180 1727204081.43447: done dumping result, returning 12180 1727204081.43460: done running TaskExecutor() for managed-node1/TASK: ** TEST check IPv6 [0affcd87-79f5-ccb1-55ae-000000000073] 12180 1727204081.43473: sending task result for task 0affcd87-79f5-ccb1-55ae-000000000073 ok: [managed-node1] => { "attempts": 1, "changed": false, "cmd": [ "ip", "-6", "a", "s", "deprecated-bond" ], "delta": "0:00:00.003337", "end": "2024-09-24 14:54:41.385523", "rc": 0, "start": "2024-09-24 14:54:41.382186" } STDOUT: 13: deprecated-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000 inet6 
2001:db8::1c3/128 scope global dynamic noprefixroute valid_lft 237sec preferred_lft 237sec inet6 2001:db8::74d7:b813:2697:d255/64 scope global dynamic noprefixroute valid_lft 1799sec preferred_lft 1799sec inet6 fe80::ee37:167b:41c5:61cc/64 scope link noprefixroute valid_lft forever preferred_lft forever 12180 1727204081.43679: no more pending results, returning what we have 12180 1727204081.43683: results queue empty 12180 1727204081.43684: checking for any_errors_fatal 12180 1727204081.43698: done checking for any_errors_fatal 12180 1727204081.43699: checking for max_fail_percentage 12180 1727204081.43701: done checking for max_fail_percentage 12180 1727204081.43702: checking to see if all hosts have failed and the running result is not ok 12180 1727204081.43703: done checking to see if all hosts have failed 12180 1727204081.43704: getting the remaining hosts for this loop 12180 1727204081.43705: done getting the remaining hosts for this loop 12180 1727204081.43710: getting the next task for host managed-node1 12180 1727204081.43723: done getting next task for host managed-node1 12180 1727204081.43732: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 12180 1727204081.43737: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 12180 1727204081.43760: getting variables 12180 1727204081.43762: in VariableManager get_vars() 12180 1727204081.43816: Calling all_inventory to load vars for managed-node1 12180 1727204081.43819: Calling groups_inventory to load vars for managed-node1 12180 1727204081.43822: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204081.43836: Calling all_plugins_play to load vars for managed-node1 12180 1727204081.43839: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204081.43842: Calling groups_plugins_play to load vars for managed-node1 12180 1727204081.44612: done sending task result for task 0affcd87-79f5-ccb1-55ae-000000000073 12180 1727204081.44616: WORKER PROCESS EXITING 12180 1727204081.45954: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204081.47695: done with get_vars() 12180 1727204081.47725: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 14:54:41 -0400 (0:00:00.365) 0:00:28.890 ***** 12180 1727204081.47837: entering _queue_task() for managed-node1/include_tasks 12180 1727204081.48333: worker is 1 (out of 1 available) 12180 1727204081.48346: exiting _queue_task() for managed-node1/include_tasks 12180 1727204081.48357: done queuing things up, now waiting for results queue to drain 12180 1727204081.48359: waiting for pending results... 
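The `** TEST check IPv6` result above reduces to a simple check: run `ip -6 a s <device>` and succeed once the stdout contains a `2001:db8::` global address (the log shows `Evaluated conditional ('2001' in result.stdout): True`). A minimal Python sketch of that parse-and-check logic, using a sample trimmed from the stdout captured above (the helper name and regex are illustrative, not part of Ansible):

```python
import re

def parse_inet6(ip_output: str):
    """Extract (address, prefixlen, scope) tuples from `ip -6 a s` output."""
    pat = re.compile(r"inet6\s+([0-9a-f:]+)/(\d+)\s+scope\s+(\w+)")
    return [(addr, int(plen), scope) for addr, plen, scope in pat.findall(ip_output)]

sample = (
    "13: deprecated-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n"
    "    inet6 2001:db8::1c3/128 scope global dynamic noprefixroute\n"
    "    inet6 fe80::ee37:167b:41c5:61cc/64 scope link noprefixroute\n"
)

addrs = parse_inet6(sample)
# The play's `until` condition is effectively: some address starts with "2001"
assert any(addr.startswith("2001") for addr, _, _ in addrs)
```

The play itself uses the cruder substring test `'2001' in result.stdout`, which is why a single attempt (`"attempts": 1`) sufficed here: the bond device already held its DHCPv6 and SLAAC addresses.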
12180 1727204081.49000: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 12180 1727204081.49381: in run() - task 0affcd87-79f5-ccb1-55ae-00000000007d 12180 1727204081.49410: variable 'ansible_search_path' from source: unknown 12180 1727204081.49419: variable 'ansible_search_path' from source: unknown 12180 1727204081.49522: calling self._execute() 12180 1727204081.49809: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204081.49855: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204081.49873: variable 'omit' from source: magic vars 12180 1727204081.50412: variable 'ansible_distribution_major_version' from source: facts 12180 1727204081.50439: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204081.50451: _execute() done 12180 1727204081.50460: dumping result to json 12180 1727204081.50470: done dumping result, returning 12180 1727204081.50481: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcd87-79f5-ccb1-55ae-00000000007d] 12180 1727204081.50494: sending task result for task 0affcd87-79f5-ccb1-55ae-00000000007d 12180 1727204081.50663: no more pending results, returning what we have 12180 1727204081.50670: in VariableManager get_vars() 12180 1727204081.50724: Calling all_inventory to load vars for managed-node1 12180 1727204081.50727: Calling groups_inventory to load vars for managed-node1 12180 1727204081.50733: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204081.50747: Calling all_plugins_play to load vars for managed-node1 12180 1727204081.50750: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204081.50754: Calling groups_plugins_play to load vars for managed-node1 12180 1727204081.51807: done sending task result for task 0affcd87-79f5-ccb1-55ae-00000000007d 12180 
1727204081.51811: WORKER PROCESS EXITING 12180 1727204081.53926: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204081.57580: done with get_vars() 12180 1727204081.57612: variable 'ansible_search_path' from source: unknown 12180 1727204081.57614: variable 'ansible_search_path' from source: unknown 12180 1727204081.57663: we have included files to process 12180 1727204081.57666: generating all_blocks data 12180 1727204081.57669: done generating all_blocks data 12180 1727204081.57675: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 12180 1727204081.57676: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 12180 1727204081.57679: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 12180 1727204081.59151: done processing included file 12180 1727204081.59154: iterating over new_blocks loaded from include file 12180 1727204081.59155: in VariableManager get_vars() 12180 1727204081.59187: done with get_vars() 12180 1727204081.59189: filtering new block on tags 12180 1727204081.59223: done filtering new block on tags 12180 1727204081.59226: in VariableManager get_vars() 12180 1727204081.59256: done with get_vars() 12180 1727204081.59257: filtering new block on tags 12180 1727204081.59302: done filtering new block on tags 12180 1727204081.59305: in VariableManager get_vars() 12180 1727204081.59332: done with get_vars() 12180 1727204081.59334: filtering new block on tags 12180 1727204081.59375: done filtering new block on tags 12180 1727204081.59377: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node1 12180 1727204081.59383: extending task lists for all hosts 
with included blocks 12180 1727204081.60710: done extending task lists 12180 1727204081.60711: done processing included files 12180 1727204081.60712: results queue empty 12180 1727204081.60713: checking for any_errors_fatal 12180 1727204081.60717: done checking for any_errors_fatal 12180 1727204081.60718: checking for max_fail_percentage 12180 1727204081.60720: done checking for max_fail_percentage 12180 1727204081.60720: checking to see if all hosts have failed and the running result is not ok 12180 1727204081.60721: done checking to see if all hosts have failed 12180 1727204081.60722: getting the remaining hosts for this loop 12180 1727204081.60723: done getting the remaining hosts for this loop 12180 1727204081.60726: getting the next task for host managed-node1 12180 1727204081.60733: done getting next task for host managed-node1 12180 1727204081.60736: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 12180 1727204081.60740: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 12180 1727204081.60750: getting variables 12180 1727204081.60751: in VariableManager get_vars() 12180 1727204081.60773: Calling all_inventory to load vars for managed-node1 12180 1727204081.60775: Calling groups_inventory to load vars for managed-node1 12180 1727204081.60777: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204081.60783: Calling all_plugins_play to load vars for managed-node1 12180 1727204081.60785: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204081.60788: Calling groups_plugins_play to load vars for managed-node1 12180 1727204081.61997: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204081.66289: done with get_vars() 12180 1727204081.66322: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 14:54:41 -0400 (0:00:00.185) 0:00:29.076 ***** 12180 1727204081.66411: entering _queue_task() for managed-node1/setup 12180 1727204081.67144: worker is 1 (out of 1 available) 12180 1727204081.67156: exiting _queue_task() for managed-node1/setup 12180 1727204081.67325: done queuing things up, now waiting for results queue to drain 12180 1727204081.67328: waiting for pending results... 
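The `Ensure ansible_facts used by role are present` task queued above is the role's fact-gathering gate: it only re-runs `setup` when some fact the role needs is missing from what has already been gathered, i.e. when the set difference between the required fact names and `ansible_facts.keys()` is non-empty. A Python sketch of that gating logic (the fact names below are illustrative examples, not the role's actual `__network_required_facts` list):

```python
def needs_fact_gathering(required: set, ansible_facts: dict) -> bool:
    """Mirror of the role's conditional:
    __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
    True only when at least one required fact has not been gathered yet."""
    return len(required - set(ansible_facts)) > 0

gathered = {"distribution": "RedHat", "distribution_major_version": "9"}

assert needs_fact_gathering({"distribution", "os_family"}, gathered)      # os_family missing
assert not needs_fact_gathering({"distribution"}, gathered)               # all present, skip setup
```

This matches the skip seen shortly after in the log: the conditional evaluates to `False` because the earlier full fact gathering already populated everything the role requires, so the task is skipped rather than paying for another `setup` round-trip.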
12180 1727204081.68196: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 12180 1727204081.68613: in run() - task 0affcd87-79f5-ccb1-55ae-000000000494 12180 1727204081.68726: variable 'ansible_search_path' from source: unknown 12180 1727204081.68760: variable 'ansible_search_path' from source: unknown 12180 1727204081.68804: calling self._execute() 12180 1727204081.68966: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204081.69090: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204081.69105: variable 'omit' from source: magic vars 12180 1727204081.69835: variable 'ansible_distribution_major_version' from source: facts 12180 1727204081.69912: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204081.70154: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12180 1727204081.74166: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12180 1727204081.74386: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12180 1727204081.74546: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12180 1727204081.74588: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12180 1727204081.74621: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12180 1727204081.74707: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12180 1727204081.74878: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12180 1727204081.74907: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12180 1727204081.75070: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12180 1727204081.75091: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12180 1727204081.75147: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12180 1727204081.75196: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12180 1727204081.75308: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12180 1727204081.75354: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12180 1727204081.75406: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12180 1727204081.75687: variable '__network_required_facts' from source: role 
'' defaults 12180 1727204081.75837: variable 'ansible_facts' from source: unknown 12180 1727204081.77811: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 12180 1727204081.77822: when evaluation is False, skipping this task 12180 1727204081.77833: _execute() done 12180 1727204081.77842: dumping result to json 12180 1727204081.77850: done dumping result, returning 12180 1727204081.77863: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcd87-79f5-ccb1-55ae-000000000494] 12180 1727204081.77876: sending task result for task 0affcd87-79f5-ccb1-55ae-000000000494 skipping: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12180 1727204081.78041: no more pending results, returning what we have 12180 1727204081.78047: results queue empty 12180 1727204081.78048: checking for any_errors_fatal 12180 1727204081.78050: done checking for any_errors_fatal 12180 1727204081.78051: checking for max_fail_percentage 12180 1727204081.78053: done checking for max_fail_percentage 12180 1727204081.78053: checking to see if all hosts have failed and the running result is not ok 12180 1727204081.78054: done checking to see if all hosts have failed 12180 1727204081.78055: getting the remaining hosts for this loop 12180 1727204081.78057: done getting the remaining hosts for this loop 12180 1727204081.78062: getting the next task for host managed-node1 12180 1727204081.78075: done getting next task for host managed-node1 12180 1727204081.78080: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 12180 1727204081.78086: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 12180 1727204081.78107: getting variables 12180 1727204081.78109: in VariableManager get_vars() 12180 1727204081.78157: Calling all_inventory to load vars for managed-node1 12180 1727204081.78160: Calling groups_inventory to load vars for managed-node1 12180 1727204081.78163: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204081.78176: Calling all_plugins_play to load vars for managed-node1 12180 1727204081.78179: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204081.78183: Calling groups_plugins_play to load vars for managed-node1 12180 1727204081.79687: done sending task result for task 0affcd87-79f5-ccb1-55ae-000000000494 12180 1727204081.79692: WORKER PROCESS EXITING 12180 1727204081.81536: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204081.83923: done with get_vars() 12180 1727204081.83956: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: 
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 14:54:41 -0400 (0:00:00.176) 0:00:29.252 ***** 12180 1727204081.84071: entering _queue_task() for managed-node1/stat 12180 1727204081.84391: worker is 1 (out of 1 available) 12180 1727204081.84405: exiting _queue_task() for managed-node1/stat 12180 1727204081.84417: done queuing things up, now waiting for results queue to drain 12180 1727204081.84419: waiting for pending results... 12180 1727204081.84720: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 12180 1727204081.84898: in run() - task 0affcd87-79f5-ccb1-55ae-000000000496 12180 1727204081.84918: variable 'ansible_search_path' from source: unknown 12180 1727204081.84925: variable 'ansible_search_path' from source: unknown 12180 1727204081.84967: calling self._execute() 12180 1727204081.85062: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204081.85075: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204081.85092: variable 'omit' from source: magic vars 12180 1727204081.85469: variable 'ansible_distribution_major_version' from source: facts 12180 1727204081.85487: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204081.85660: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12180 1727204081.85926: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12180 1727204081.85983: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12180 1727204081.86021: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12180 1727204081.86064: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 
12180 1727204081.86152: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12180 1727204081.86187: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12180 1727204081.86213: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12180 1727204081.86245: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12180 1727204081.86350: variable '__network_is_ostree' from source: set_fact 12180 1727204081.86362: Evaluated conditional (not __network_is_ostree is defined): False 12180 1727204081.86372: when evaluation is False, skipping this task 12180 1727204081.86379: _execute() done 12180 1727204081.86389: dumping result to json 12180 1727204081.86396: done dumping result, returning 12180 1727204081.86406: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcd87-79f5-ccb1-55ae-000000000496] 12180 1727204081.86417: sending task result for task 0affcd87-79f5-ccb1-55ae-000000000496 skipping: [managed-node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 12180 1727204081.86570: no more pending results, returning what we have 12180 1727204081.86574: results queue empty 12180 1727204081.86575: checking for any_errors_fatal 12180 1727204081.86583: done checking for any_errors_fatal 12180 1727204081.86583: checking for max_fail_percentage 12180 1727204081.86586: done checking for 
max_fail_percentage 12180 1727204081.86586: checking to see if all hosts have failed and the running result is not ok 12180 1727204081.86587: done checking to see if all hosts have failed 12180 1727204081.86588: getting the remaining hosts for this loop 12180 1727204081.86589: done getting the remaining hosts for this loop 12180 1727204081.86593: getting the next task for host managed-node1 12180 1727204081.86602: done getting next task for host managed-node1 12180 1727204081.86605: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 12180 1727204081.86611: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 12180 1727204081.86634: getting variables 12180 1727204081.86636: in VariableManager get_vars() 12180 1727204081.86681: Calling all_inventory to load vars for managed-node1 12180 1727204081.86684: Calling groups_inventory to load vars for managed-node1 12180 1727204081.86687: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204081.86699: Calling all_plugins_play to load vars for managed-node1 12180 1727204081.86701: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204081.86704: Calling groups_plugins_play to load vars for managed-node1 12180 1727204081.88420: done sending task result for task 0affcd87-79f5-ccb1-55ae-000000000496 12180 1727204081.88425: WORKER PROCESS EXITING 12180 1727204081.88920: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204081.92669: done with get_vars() 12180 1727204081.92704: done getting variables 12180 1727204081.92884: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 14:54:41 -0400 (0:00:00.088) 0:00:29.341 ***** 12180 1727204081.92925: entering _queue_task() for managed-node1/set_fact 12180 1727204081.93674: worker is 1 (out of 1 available) 12180 1727204081.93687: exiting _queue_task() for managed-node1/set_fact 12180 1727204081.93703: done queuing things up, now waiting for results queue to drain 12180 1727204081.93705: waiting for pending results... 
12180 1727204081.94575: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 12180 1727204081.94942: in run() - task 0affcd87-79f5-ccb1-55ae-000000000497 12180 1727204081.95047: variable 'ansible_search_path' from source: unknown 12180 1727204081.95084: variable 'ansible_search_path' from source: unknown 12180 1727204081.95206: calling self._execute() 12180 1727204081.95344: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204081.95415: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204081.95468: variable 'omit' from source: magic vars 12180 1727204081.96287: variable 'ansible_distribution_major_version' from source: facts 12180 1727204081.96368: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204081.96540: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12180 1727204081.96827: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12180 1727204081.96881: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12180 1727204081.96926: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12180 1727204081.96973: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12180 1727204081.97107: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12180 1727204081.97140: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12180 1727204081.97180: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12180 1727204081.97214: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12180 1727204081.97316: variable '__network_is_ostree' from source: set_fact 12180 1727204081.97328: Evaluated conditional (not __network_is_ostree is defined): False 12180 1727204081.97339: when evaluation is False, skipping this task 12180 1727204081.97346: _execute() done 12180 1727204081.97352: dumping result to json 12180 1727204081.97359: done dumping result, returning 12180 1727204081.97375: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcd87-79f5-ccb1-55ae-000000000497] 12180 1727204081.97386: sending task result for task 0affcd87-79f5-ccb1-55ae-000000000497 12180 1727204081.97499: done sending task result for task 0affcd87-79f5-ccb1-55ae-000000000497 12180 1727204081.97506: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 12180 1727204081.97555: no more pending results, returning what we have 12180 1727204081.97558: results queue empty 12180 1727204081.97559: checking for any_errors_fatal 12180 1727204081.97569: done checking for any_errors_fatal 12180 1727204081.97570: checking for max_fail_percentage 12180 1727204081.97572: done checking for max_fail_percentage 12180 1727204081.97572: checking to see if all hosts have failed and the running result is not ok 12180 1727204081.97573: done checking to see if all hosts have failed 12180 1727204081.97574: getting the remaining hosts for this loop 12180 1727204081.97575: done getting the remaining hosts for this loop 
12180 1727204081.97579: getting the next task for host managed-node1 12180 1727204081.97589: done getting next task for host managed-node1 12180 1727204081.97593: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 12180 1727204081.97599: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 12180 1727204081.97619: getting variables 12180 1727204081.97620: in VariableManager get_vars() 12180 1727204081.97661: Calling all_inventory to load vars for managed-node1 12180 1727204081.97666: Calling groups_inventory to load vars for managed-node1 12180 1727204081.97668: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204081.97678: Calling all_plugins_play to load vars for managed-node1 12180 1727204081.97680: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204081.97682: Calling groups_plugins_play to load vars for managed-node1 12180 1727204081.99622: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204082.01223: done with get_vars() 12180 1727204082.01252: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 14:54:42 -0400 (0:00:00.087) 0:00:29.428 ***** 12180 1727204082.01640: entering _queue_task() for managed-node1/service_facts 12180 1727204082.01977: worker is 1 (out of 1 available) 12180 1727204082.01991: exiting _queue_task() for managed-node1/service_facts 12180 1727204082.02005: done queuing things up, now waiting for results queue to drain 12180 1727204082.02007: waiting for pending results... 
12180 1727204082.02312: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which services are running 12180 1727204082.02475: in run() - task 0affcd87-79f5-ccb1-55ae-000000000499 12180 1727204082.02488: variable 'ansible_search_path' from source: unknown 12180 1727204082.02492: variable 'ansible_search_path' from source: unknown 12180 1727204082.02528: calling self._execute() 12180 1727204082.02623: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204082.02629: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204082.02639: variable 'omit' from source: magic vars 12180 1727204082.02997: variable 'ansible_distribution_major_version' from source: facts 12180 1727204082.03009: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204082.03015: variable 'omit' from source: magic vars 12180 1727204082.03091: variable 'omit' from source: magic vars 12180 1727204082.03133: variable 'omit' from source: magic vars 12180 1727204082.03171: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12180 1727204082.03206: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12180 1727204082.03235: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12180 1727204082.03252: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204082.03261: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204082.03296: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12180 1727204082.03299: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204082.03304: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed-node1' 12180 1727204082.03501: Set connection var ansible_pipelining to False 12180 1727204082.03505: Set connection var ansible_shell_type to sh 12180 1727204082.03511: Set connection var ansible_timeout to 10 12180 1727204082.03517: Set connection var ansible_connection to ssh 12180 1727204082.03524: Set connection var ansible_shell_executable to /bin/sh 12180 1727204082.03527: Set connection var ansible_module_compression to ZIP_DEFLATED 12180 1727204082.03562: variable 'ansible_shell_executable' from source: unknown 12180 1727204082.03567: variable 'ansible_connection' from source: unknown 12180 1727204082.03571: variable 'ansible_module_compression' from source: unknown 12180 1727204082.03573: variable 'ansible_shell_type' from source: unknown 12180 1727204082.03575: variable 'ansible_shell_executable' from source: unknown 12180 1727204082.03578: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204082.03580: variable 'ansible_pipelining' from source: unknown 12180 1727204082.03582: variable 'ansible_timeout' from source: unknown 12180 1727204082.03586: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204082.03849: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 12180 1727204082.03859: variable 'omit' from source: magic vars 12180 1727204082.03864: starting attempt loop 12180 1727204082.03869: running the handler 12180 1727204082.03885: _low_level_execute_command(): starting 12180 1727204082.03900: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12180 1727204082.04631: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204082.04639: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 12180 1727204082.04652: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204082.04674: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204082.04711: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204082.04720: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204082.04727: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204082.04740: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204082.04748: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204082.04755: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204082.04765: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204082.04781: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204082.04792: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204082.04800: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204082.04807: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204082.04818: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204082.04897: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204082.04911: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204082.04915: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204082.05005: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 
1727204082.06655: stdout chunk (state=3): >>>/root <<< 12180 1727204082.06762: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204082.06860: stderr chunk (state=3): >>><<< 12180 1727204082.06863: stdout chunk (state=3): >>><<< 12180 1727204082.06888: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204082.06901: _low_level_execute_command(): starting 12180 1727204082.06906: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204082.068873-15017-220110183775614 `" && echo ansible-tmp-1727204082.068873-15017-220110183775614="` echo /root/.ansible/tmp/ansible-tmp-1727204082.068873-15017-220110183775614 `" ) && sleep 0' 12180 1727204082.09406: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204082.09525: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204082.09570: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204082.09574: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204082.09577: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204082.09767: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204082.09771: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204082.09773: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204082.09852: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204082.11742: stdout chunk (state=3): >>>ansible-tmp-1727204082.068873-15017-220110183775614=/root/.ansible/tmp/ansible-tmp-1727204082.068873-15017-220110183775614 <<< 12180 1727204082.11852: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204082.11937: stderr chunk (state=3): >>><<< 12180 1727204082.11941: stdout chunk (state=3): >>><<< 12180 1727204082.12173: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204082.068873-15017-220110183775614=/root/.ansible/tmp/ansible-tmp-1727204082.068873-15017-220110183775614 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204082.12176: variable 'ansible_module_compression' from source: unknown 12180 1727204082.12179: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12180cbnqllfr/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 12180 1727204082.12181: variable 'ansible_facts' from source: unknown 12180 1727204082.12192: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204082.068873-15017-220110183775614/AnsiballZ_service_facts.py 12180 1727204082.12845: Sending initial data 12180 1727204082.12849: Sent initial data (161 bytes) 12180 1727204082.14244: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204082.14283: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 12180 1727204082.14286: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration <<< 12180 1727204082.14288: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204082.14291: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204082.14365: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204082.14369: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204082.14371: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204082.14444: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204082.16153: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports 
extension "expand-path@openssh.com" revision 1 <<< 12180 1727204082.16204: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12180 1727204082.16262: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12180cbnqllfr/tmpxw_q6aqq /root/.ansible/tmp/ansible-tmp-1727204082.068873-15017-220110183775614/AnsiballZ_service_facts.py <<< 12180 1727204082.16313: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12180 1727204082.17741: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204082.18013: stderr chunk (state=3): >>><<< 12180 1727204082.18017: stdout chunk (state=3): >>><<< 12180 1727204082.18019: done transferring module to remote 12180 1727204082.18021: _low_level_execute_command(): starting 12180 1727204082.18023: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204082.068873-15017-220110183775614/ /root/.ansible/tmp/ansible-tmp-1727204082.068873-15017-220110183775614/AnsiballZ_service_facts.py && sleep 0' 12180 1727204082.19461: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204082.19478: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204082.19491: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204082.19506: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204082.19555: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204082.19674: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204082.19690: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204082.19709: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204082.19727: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204082.19739: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204082.19753: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204082.19772: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204082.19790: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204082.19802: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204082.19813: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204082.19826: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204082.19906: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204082.20002: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204082.20018: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204082.20215: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204082.22009: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204082.22013: stdout chunk (state=3): >>><<< 12180 1727204082.22015: stderr chunk (state=3): >>><<< 12180 1727204082.22111: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204082.22115: _low_level_execute_command(): starting 12180 1727204082.22118: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204082.068873-15017-220110183775614/AnsiballZ_service_facts.py && sleep 0' 12180 1727204082.23619: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204082.23775: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204082.23792: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204082.23812: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204082.23856: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204082.23875: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204082.23891: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204082.23910: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204082.23924: stderr chunk 
(state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204082.23936: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204082.23949: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204082.23967: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204082.23988: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204082.24001: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204082.24015: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204082.24029: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204082.24222: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204082.24247: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204082.24268: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204082.24366: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204083.54511: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", 
"status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": 
"network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": 
"systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", 
"status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "stat<<< 12180 1727204083.54525: stdout chunk (state=3): >>>e": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": 
"systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, 
"systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "syst<<< 12180 1727204083.54620: stdout chunk (state=3): >>>emd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": 
"user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": 
"dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.s<<< 12180 1727204083.54646: stdout chunk (state=3): 
>>>ervice", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", 
"status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", 
"status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hiber<<< 12180 1727204083.54666: stdout chunk (state=3): >>>nate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", 
"state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 12180 1727204083.55929: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
<<< 12180 1727204083.55933: stdout chunk (state=3): >>><<< 12180 1727204083.55935: stderr chunk (state=3): >>><<< 12180 1727204083.56176: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": 
"systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": 
"nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": 
{"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": 
"static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": 
"static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": 
{"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 12180 1727204083.56748: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204082.068873-15017-220110183775614/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12180 1727204083.56768: _low_level_execute_command(): starting 12180 1727204083.56790: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204082.068873-15017-220110183775614/ > /dev/null 2>&1 && sleep 0' 12180 1727204083.57627: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204083.57644: stderr chunk (state=3): >>>debug1: Reading 
configuration data /root/.ssh/config <<< 12180 1727204083.57660: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204083.57681: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204083.57736: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204083.57755: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204083.57772: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204083.57790: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204083.57804: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204083.57824: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204083.57838: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204083.57853: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204083.57870: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204083.57882: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204083.57893: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204083.57907: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204083.58018: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204083.58062: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204083.58083: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204083.58177: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 12180 1727204083.59947: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204083.60054: stderr chunk (state=3): >>><<< 12180 1727204083.60068: stdout chunk (state=3): >>><<< 12180 1727204083.60175: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204083.60179: handler run complete 12180 1727204083.60375: variable 'ansible_facts' from source: unknown 12180 1727204083.60462: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204083.60939: variable 'ansible_facts' from source: unknown 12180 1727204083.61084: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204083.61288: attempt loop complete, returning result 12180 
1727204083.61298: _execute() done 12180 1727204083.61304: dumping result to json 12180 1727204083.61374: done dumping result, returning 12180 1727204083.61387: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which services are running [0affcd87-79f5-ccb1-55ae-000000000499] 12180 1727204083.61397: sending task result for task 0affcd87-79f5-ccb1-55ae-000000000499 ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12180 1727204083.62205: no more pending results, returning what we have 12180 1727204083.62209: results queue empty 12180 1727204083.62210: checking for any_errors_fatal 12180 1727204083.62215: done checking for any_errors_fatal 12180 1727204083.62216: checking for max_fail_percentage 12180 1727204083.62218: done checking for max_fail_percentage 12180 1727204083.62219: checking to see if all hosts have failed and the running result is not ok 12180 1727204083.62219: done checking to see if all hosts have failed 12180 1727204083.62220: getting the remaining hosts for this loop 12180 1727204083.62221: done getting the remaining hosts for this loop 12180 1727204083.62225: getting the next task for host managed-node1 12180 1727204083.62234: done getting next task for host managed-node1 12180 1727204083.62238: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 12180 1727204083.62245: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 12180 1727204083.62258: getting variables 12180 1727204083.62260: in VariableManager get_vars() 12180 1727204083.62318: Calling all_inventory to load vars for managed-node1 12180 1727204083.62321: Calling groups_inventory to load vars for managed-node1 12180 1727204083.62324: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204083.62341: Calling all_plugins_play to load vars for managed-node1 12180 1727204083.62343: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204083.62347: Calling groups_plugins_play to load vars for managed-node1 12180 1727204083.63773: done sending task result for task 0affcd87-79f5-ccb1-55ae-000000000499 12180 1727204083.63778: WORKER PROCESS EXITING 12180 1727204083.65111: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204083.66339: done with get_vars() 12180 1727204083.66365: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 14:54:43 -0400 (0:00:01.648) 0:00:31.076 ***** 12180 1727204083.66443: entering _queue_task() for 
managed-node1/package_facts 12180 1727204083.66686: worker is 1 (out of 1 available) 12180 1727204083.66699: exiting _queue_task() for managed-node1/package_facts 12180 1727204083.66712: done queuing things up, now waiting for results queue to drain 12180 1727204083.66714: waiting for pending results... 12180 1727204083.66912: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 12180 1727204083.67288: in run() - task 0affcd87-79f5-ccb1-55ae-00000000049a 12180 1727204083.67291: variable 'ansible_search_path' from source: unknown 12180 1727204083.67293: variable 'ansible_search_path' from source: unknown 12180 1727204083.67296: calling self._execute() 12180 1727204083.67771: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204083.67776: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204083.67779: variable 'omit' from source: magic vars 12180 1727204083.67904: variable 'ansible_distribution_major_version' from source: facts 12180 1727204083.67914: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204083.67972: variable 'omit' from source: magic vars 12180 1727204083.68007: variable 'omit' from source: magic vars 12180 1727204083.68041: variable 'omit' from source: magic vars 12180 1727204083.68088: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12180 1727204083.68119: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12180 1727204083.68140: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12180 1727204083.68167: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204083.68178: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 12180 1727204083.68207: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12180 1727204083.68211: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204083.68213: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204083.68323: Set connection var ansible_pipelining to False 12180 1727204083.68326: Set connection var ansible_shell_type to sh 12180 1727204083.68334: Set connection var ansible_timeout to 10 12180 1727204083.68337: Set connection var ansible_connection to ssh 12180 1727204083.68343: Set connection var ansible_shell_executable to /bin/sh 12180 1727204083.68348: Set connection var ansible_module_compression to ZIP_DEFLATED 12180 1727204083.68383: variable 'ansible_shell_executable' from source: unknown 12180 1727204083.68386: variable 'ansible_connection' from source: unknown 12180 1727204083.68389: variable 'ansible_module_compression' from source: unknown 12180 1727204083.68392: variable 'ansible_shell_type' from source: unknown 12180 1727204083.68394: variable 'ansible_shell_executable' from source: unknown 12180 1727204083.68396: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204083.68401: variable 'ansible_pipelining' from source: unknown 12180 1727204083.68404: variable 'ansible_timeout' from source: unknown 12180 1727204083.68406: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204083.68619: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 12180 1727204083.68629: variable 'omit' from source: magic vars 12180 1727204083.68635: starting attempt loop 12180 1727204083.68637: running the handler 12180 1727204083.68651: _low_level_execute_command(): 
starting 12180 1727204083.68659: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12180 1727204083.69755: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204083.69905: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204083.69912: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204083.69978: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204083.69982: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204083.69995: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204083.70007: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204083.70090: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204083.70104: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204083.70109: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204083.70266: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204083.71935: stdout chunk (state=3): >>>/root <<< 12180 1727204083.72083: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204083.72087: stderr chunk 
(state=3): >>><<< 12180 1727204083.72089: stdout chunk (state=3): >>><<< 12180 1727204083.72093: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204083.72096: _low_level_execute_command(): starting 12180 1727204083.72098: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204083.720671-15075-115690677774592 `" && echo ansible-tmp-1727204083.720671-15075-115690677774592="` echo /root/.ansible/tmp/ansible-tmp-1727204083.720671-15075-115690677774592 `" ) && sleep 0' 12180 1727204083.73973: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204083.74085: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204083.74093: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204083.74107: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204083.74148: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204083.74154: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204083.74166: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204083.74182: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204083.74256: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204083.74263: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204083.74272: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204083.74282: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204083.74294: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204083.74302: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204083.74309: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204083.74320: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204083.74394: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204083.74415: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204083.74432: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204083.74517: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204083.76432: stdout chunk (state=3): 
>>>ansible-tmp-1727204083.720671-15075-115690677774592=/root/.ansible/tmp/ansible-tmp-1727204083.720671-15075-115690677774592 <<< 12180 1727204083.76622: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204083.76626: stdout chunk (state=3): >>><<< 12180 1727204083.76635: stderr chunk (state=3): >>><<< 12180 1727204083.76650: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204083.720671-15075-115690677774592=/root/.ansible/tmp/ansible-tmp-1727204083.720671-15075-115690677774592 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204083.76701: variable 'ansible_module_compression' from source: unknown 12180 1727204083.76751: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12180cbnqllfr/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 12180 1727204083.76814: variable 'ansible_facts' from source: unknown 12180 
1727204083.77012: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204083.720671-15075-115690677774592/AnsiballZ_package_facts.py 12180 1727204083.77945: Sending initial data 12180 1727204083.77949: Sent initial data (161 bytes) 12180 1727204083.80278: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204083.80288: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204083.80298: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204083.80314: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204083.80356: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204083.80363: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204083.80376: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204083.80390: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204083.80398: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204083.80404: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204083.80412: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204083.80422: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204083.80436: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204083.80439: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204083.80446: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204083.80455: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204083.80531: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204083.80549: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204083.80561: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204083.80650: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204083.82353: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12180 1727204083.82406: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12180 1727204083.82454: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12180cbnqllfr/tmpijkh0vpb /root/.ansible/tmp/ansible-tmp-1727204083.720671-15075-115690677774592/AnsiballZ_package_facts.py <<< 12180 1727204083.82502: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12180 1727204083.85303: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204083.85392: stderr chunk (state=3): >>><<< 12180 1727204083.85396: stdout chunk (state=3): >>><<< 12180 1727204083.85418: done transferring module to remote 12180 1727204083.85429: _low_level_execute_command(): starting 12180 1727204083.85437: _low_level_execute_command(): executing: /bin/sh 
-c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204083.720671-15075-115690677774592/ /root/.ansible/tmp/ansible-tmp-1727204083.720671-15075-115690677774592/AnsiballZ_package_facts.py && sleep 0' 12180 1727204083.86113: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204083.86123: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204083.86258: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204083.86262: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204083.86272: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204083.86276: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204083.86278: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204083.86280: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204083.86283: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204083.86285: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204083.86287: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204083.86411: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204083.86414: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204083.86417: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204083.86419: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204083.86421: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 
1727204083.86437: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204083.86451: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204083.86463: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204083.86572: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204083.88374: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204083.88378: stdout chunk (state=3): >>><<< 12180 1727204083.88383: stderr chunk (state=3): >>><<< 12180 1727204083.88402: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204083.88406: _low_level_execute_command(): starting 12180 1727204083.88411: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 
/root/.ansible/tmp/ansible-tmp-1727204083.720671-15075-115690677774592/AnsiballZ_package_facts.py && sleep 0' 12180 1727204083.89070: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204083.89080: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204083.89091: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204083.89105: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204083.89147: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204083.89151: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204083.89161: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204083.89177: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204083.89185: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204083.89192: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204083.89200: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204083.89210: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204083.89223: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204083.89232: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204083.89235: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204083.89246: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204083.89325: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 
1727204083.89344: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204083.89358: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204083.89456: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204084.35251: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", 
"version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", 
"version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{<<< 12180 1727204084.35286: stdout chunk (state=3): >>>"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": 
[{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", 
"epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name":
"attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "47.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7",
"release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cronie-anacron": [{"name": 
"cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name":
"tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": 
"0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch":
"x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", 
"release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}],
"libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64",
"source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": 
"lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release":
"31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "pe<<< 12180 1727204084.35441: stdout chunk (state=3): >>>rl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 
1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", 
"release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}],<<< 12180 1727204084.35470: stdout chunk (state=3): >>> "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "<<< 12180 1727204084.35474: stdout chunk (state=3): >>>0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", 
"release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "s<<< 12180 1727204084.35479: stdout chunk (state=3): >>>ource": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", 
"version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el<<< 12180 1727204084.35482: stdout chunk (state=3): >>>9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": 
"4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 12180 1727204084.37045: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. <<< 12180 1727204084.37049: stdout chunk (state=3): >>><<< 12180 1727204084.37055: stderr chunk (state=3): >>><<< 12180 1727204084.37103: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", 
"release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": 
[{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", 
"release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", 
"release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": 
"kernel-tools-libs", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "47.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": 
"1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": 
"2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": 
"gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": 
[{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": 
"kernel-tools", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": 
"x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, 
"arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": 
"noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": 
[{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", 
"source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", 
"epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": 
"13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": 
[{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 12180 1727204084.39752: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204083.720671-15075-115690677774592/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12180 1727204084.39778: _low_level_execute_command(): starting 12180 1727204084.39782: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204083.720671-15075-115690677774592/ > /dev/null 2>&1 && sleep 0' 12180 1727204084.40509: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204084.40516: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204084.40527: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204084.40553: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204084.40596: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204084.40603: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204084.40618: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204084.40625: stderr 
chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204084.40637: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204084.40645: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204084.40660: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204084.40672: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204084.40684: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204084.40691: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204084.40698: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204084.40708: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204084.40789: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204084.40810: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204084.40823: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204084.40910: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204084.42719: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204084.42805: stderr chunk (state=3): >>><<< 12180 1727204084.42809: stdout chunk (state=3): >>><<< 12180 1727204084.42832: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204084.42841: handler run complete 12180 1727204084.43766: variable 'ansible_facts' from source: unknown 12180 1727204084.44280: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204084.46431: variable 'ansible_facts' from source: unknown 12180 1727204084.47496: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204084.48425: attempt loop complete, returning result 12180 1727204084.48443: _execute() done 12180 1727204084.48446: dumping result to json 12180 1727204084.48663: done dumping result, returning 12180 1727204084.48673: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcd87-79f5-ccb1-55ae-00000000049a] 12180 1727204084.48679: sending task result for task 0affcd87-79f5-ccb1-55ae-00000000049a ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12180 1727204084.51993: no more pending results, returning what we have 12180 1727204084.51996: results queue empty 12180 
1727204084.51998: checking for any_errors_fatal 12180 1727204084.52004: done checking for any_errors_fatal 12180 1727204084.52005: checking for max_fail_percentage 12180 1727204084.52007: done checking for max_fail_percentage 12180 1727204084.52008: checking to see if all hosts have failed and the running result is not ok 12180 1727204084.52010: done checking to see if all hosts have failed 12180 1727204084.52011: getting the remaining hosts for this loop 12180 1727204084.52012: done getting the remaining hosts for this loop 12180 1727204084.52016: getting the next task for host managed-node1 12180 1727204084.52025: done getting next task for host managed-node1 12180 1727204084.52029: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 12180 1727204084.52034: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 12180 1727204084.52048: getting variables 12180 1727204084.52050: in VariableManager get_vars() 12180 1727204084.52091: Calling all_inventory to load vars for managed-node1 12180 1727204084.52094: Calling groups_inventory to load vars for managed-node1 12180 1727204084.52097: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204084.52106: Calling all_plugins_play to load vars for managed-node1 12180 1727204084.52108: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204084.52111: Calling groups_plugins_play to load vars for managed-node1 12180 1727204084.53672: done sending task result for task 0affcd87-79f5-ccb1-55ae-00000000049a 12180 1727204084.53680: WORKER PROCESS EXITING 12180 1727204084.54574: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204084.56186: done with get_vars() 12180 1727204084.56215: done getting variables 12180 1727204084.56279: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 14:54:44 -0400 (0:00:00.898) 0:00:31.975 ***** 12180 1727204084.56318: entering _queue_task() for managed-node1/debug 12180 1727204084.56650: worker is 1 (out of 1 available) 12180 1727204084.56663: exiting _queue_task() for managed-node1/debug 12180 1727204084.56676: done queuing things up, now waiting for results queue to drain 12180 1727204084.56678: waiting for pending results... 
12180 1727204084.56975: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Print network provider 12180 1727204084.57142: in run() - task 0affcd87-79f5-ccb1-55ae-00000000007e 12180 1727204084.57165: variable 'ansible_search_path' from source: unknown 12180 1727204084.57174: variable 'ansible_search_path' from source: unknown 12180 1727204084.57218: calling self._execute() 12180 1727204084.57318: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204084.57329: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204084.57347: variable 'omit' from source: magic vars 12180 1727204084.57723: variable 'ansible_distribution_major_version' from source: facts 12180 1727204084.57742: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204084.57752: variable 'omit' from source: magic vars 12180 1727204084.57822: variable 'omit' from source: magic vars 12180 1727204084.57913: variable 'network_provider' from source: set_fact 12180 1727204084.57932: variable 'omit' from source: magic vars 12180 1727204084.57980: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12180 1727204084.58021: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12180 1727204084.58047: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12180 1727204084.58070: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204084.58085: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204084.58120: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12180 1727204084.58127: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 
1727204084.58134: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204084.58237: Set connection var ansible_pipelining to False 12180 1727204084.58245: Set connection var ansible_shell_type to sh 12180 1727204084.58255: Set connection var ansible_timeout to 10 12180 1727204084.58265: Set connection var ansible_connection to ssh 12180 1727204084.58275: Set connection var ansible_shell_executable to /bin/sh 12180 1727204084.58285: Set connection var ansible_module_compression to ZIP_DEFLATED 12180 1727204084.58319: variable 'ansible_shell_executable' from source: unknown 12180 1727204084.58326: variable 'ansible_connection' from source: unknown 12180 1727204084.58332: variable 'ansible_module_compression' from source: unknown 12180 1727204084.58339: variable 'ansible_shell_type' from source: unknown 12180 1727204084.58345: variable 'ansible_shell_executable' from source: unknown 12180 1727204084.58350: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204084.58357: variable 'ansible_pipelining' from source: unknown 12180 1727204084.58365: variable 'ansible_timeout' from source: unknown 12180 1727204084.58373: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204084.58514: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12180 1727204084.58533: variable 'omit' from source: magic vars 12180 1727204084.58543: starting attempt loop 12180 1727204084.58549: running the handler 12180 1727204084.58597: handler run complete 12180 1727204084.58614: attempt loop complete, returning result 12180 1727204084.58621: _execute() done 12180 1727204084.58626: dumping result to json 12180 1727204084.58637: done dumping result, returning 
12180 1727204084.58652: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Print network provider [0affcd87-79f5-ccb1-55ae-00000000007e] 12180 1727204084.58661: sending task result for task 0affcd87-79f5-ccb1-55ae-00000000007e ok: [managed-node1] => {} MSG: Using network provider: nm 12180 1727204084.58821: no more pending results, returning what we have 12180 1727204084.58825: results queue empty 12180 1727204084.58826: checking for any_errors_fatal 12180 1727204084.58837: done checking for any_errors_fatal 12180 1727204084.58838: checking for max_fail_percentage 12180 1727204084.58840: done checking for max_fail_percentage 12180 1727204084.58841: checking to see if all hosts have failed and the running result is not ok 12180 1727204084.58842: done checking to see if all hosts have failed 12180 1727204084.58842: getting the remaining hosts for this loop 12180 1727204084.58844: done getting the remaining hosts for this loop 12180 1727204084.58848: getting the next task for host managed-node1 12180 1727204084.58857: done getting next task for host managed-node1 12180 1727204084.58861: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 12180 1727204084.58867: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 12180 1727204084.58880: getting variables 12180 1727204084.58881: in VariableManager get_vars() 12180 1727204084.58923: Calling all_inventory to load vars for managed-node1 12180 1727204084.58926: Calling groups_inventory to load vars for managed-node1 12180 1727204084.58928: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204084.58940: Calling all_plugins_play to load vars for managed-node1 12180 1727204084.58943: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204084.58946: Calling groups_plugins_play to load vars for managed-node1 12180 1727204084.60176: done sending task result for task 0affcd87-79f5-ccb1-55ae-00000000007e 12180 1727204084.60180: WORKER PROCESS EXITING 12180 1727204084.60626: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204084.62242: done with get_vars() 12180 1727204084.62549: done getting variables 12180 1727204084.62612: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 14:54:44 -0400 (0:00:00.063) 0:00:32.038 ***** 12180 1727204084.62650: entering _queue_task() for managed-node1/fail 12180 1727204084.62958: worker is 1 (out of 1 available) 12180 1727204084.62972: exiting _queue_task() for 
managed-node1/fail 12180 1727204084.62984: done queuing things up, now waiting for results queue to drain 12180 1727204084.62986: waiting for pending results... 12180 1727204084.63705: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 12180 1727204084.63885: in run() - task 0affcd87-79f5-ccb1-55ae-00000000007f 12180 1727204084.63909: variable 'ansible_search_path' from source: unknown 12180 1727204084.63918: variable 'ansible_search_path' from source: unknown 12180 1727204084.63970: calling self._execute() 12180 1727204084.64083: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204084.64095: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204084.64115: variable 'omit' from source: magic vars 12180 1727204084.65655: variable 'ansible_distribution_major_version' from source: facts 12180 1727204084.65677: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204084.65808: variable 'network_state' from source: role '' defaults 12180 1727204084.65823: Evaluated conditional (network_state != {}): False 12180 1727204084.65831: when evaluation is False, skipping this task 12180 1727204084.65838: _execute() done 12180 1727204084.65845: dumping result to json 12180 1727204084.65851: done dumping result, returning 12180 1727204084.65862: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcd87-79f5-ccb1-55ae-00000000007f] 12180 1727204084.65876: sending task result for task 0affcd87-79f5-ccb1-55ae-00000000007f 12180 1727204084.65992: done sending task result for task 0affcd87-79f5-ccb1-55ae-00000000007f 12180 1727204084.65999: WORKER PROCESS EXITING skipping: [managed-node1] => { 
"changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 12180 1727204084.66051: no more pending results, returning what we have 12180 1727204084.66055: results queue empty 12180 1727204084.66056: checking for any_errors_fatal 12180 1727204084.66062: done checking for any_errors_fatal 12180 1727204084.66063: checking for max_fail_percentage 12180 1727204084.66067: done checking for max_fail_percentage 12180 1727204084.66068: checking to see if all hosts have failed and the running result is not ok 12180 1727204084.66069: done checking to see if all hosts have failed 12180 1727204084.66069: getting the remaining hosts for this loop 12180 1727204084.66071: done getting the remaining hosts for this loop 12180 1727204084.66075: getting the next task for host managed-node1 12180 1727204084.66084: done getting next task for host managed-node1 12180 1727204084.66089: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 12180 1727204084.66093: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 12180 1727204084.66114: getting variables 12180 1727204084.66116: in VariableManager get_vars() 12180 1727204084.66159: Calling all_inventory to load vars for managed-node1 12180 1727204084.66162: Calling groups_inventory to load vars for managed-node1 12180 1727204084.66166: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204084.66178: Calling all_plugins_play to load vars for managed-node1 12180 1727204084.66181: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204084.66183: Calling groups_plugins_play to load vars for managed-node1 12180 1727204084.72183: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204084.75045: done with get_vars() 12180 1727204084.75080: done getting variables 12180 1727204084.75129: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 14:54:44 -0400 (0:00:00.125) 0:00:32.163 ***** 12180 1727204084.75161: entering _queue_task() for managed-node1/fail 12180 1727204084.75480: worker is 1 (out of 1 available) 12180 1727204084.75492: exiting _queue_task() for managed-node1/fail 12180 1727204084.75504: done queuing things up, now waiting for results queue to drain 12180 1727204084.75506: waiting for pending results... 
12180 1727204084.77262: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 12180 1727204084.77587: in run() - task 0affcd87-79f5-ccb1-55ae-000000000080 12180 1727204084.77606: variable 'ansible_search_path' from source: unknown 12180 1727204084.77618: variable 'ansible_search_path' from source: unknown 12180 1727204084.77665: calling self._execute() 12180 1727204084.77769: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204084.77784: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204084.77797: variable 'omit' from source: magic vars 12180 1727204084.78188: variable 'ansible_distribution_major_version' from source: facts 12180 1727204084.78213: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204084.78351: variable 'network_state' from source: role '' defaults 12180 1727204084.78372: Evaluated conditional (network_state != {}): False 12180 1727204084.78382: when evaluation is False, skipping this task 12180 1727204084.78389: _execute() done 12180 1727204084.78396: dumping result to json 12180 1727204084.78403: done dumping result, returning 12180 1727204084.78415: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affcd87-79f5-ccb1-55ae-000000000080] 12180 1727204084.78430: sending task result for task 0affcd87-79f5-ccb1-55ae-000000000080 skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 12180 1727204084.78605: no more pending results, returning what we have 12180 1727204084.78609: results queue empty 12180 1727204084.78610: checking for any_errors_fatal 12180 1727204084.78619: done checking for any_errors_fatal 
12180 1727204084.78620: checking for max_fail_percentage 12180 1727204084.78622: done checking for max_fail_percentage 12180 1727204084.78623: checking to see if all hosts have failed and the running result is not ok 12180 1727204084.78624: done checking to see if all hosts have failed 12180 1727204084.78624: getting the remaining hosts for this loop 12180 1727204084.78626: done getting the remaining hosts for this loop 12180 1727204084.78630: getting the next task for host managed-node1 12180 1727204084.78639: done getting next task for host managed-node1 12180 1727204084.78643: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 12180 1727204084.78648: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 12180 1727204084.78673: getting variables 12180 1727204084.78676: in VariableManager get_vars() 12180 1727204084.78720: Calling all_inventory to load vars for managed-node1 12180 1727204084.78723: Calling groups_inventory to load vars for managed-node1 12180 1727204084.78726: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204084.78738: Calling all_plugins_play to load vars for managed-node1 12180 1727204084.78741: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204084.78744: Calling groups_plugins_play to load vars for managed-node1 12180 1727204084.79704: done sending task result for task 0affcd87-79f5-ccb1-55ae-000000000080 12180 1727204084.79708: WORKER PROCESS EXITING 12180 1727204084.80513: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204084.83192: done with get_vars() 12180 1727204084.83223: done getting variables 12180 1727204084.83286: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 14:54:44 -0400 (0:00:00.081) 0:00:32.245 ***** 12180 1727204084.83327: entering _queue_task() for managed-node1/fail 12180 1727204084.84792: worker is 1 (out of 1 available) 12180 1727204084.84805: exiting _queue_task() for managed-node1/fail 12180 1727204084.84819: done queuing things up, now waiting for results queue to drain 12180 1727204084.84821: waiting for pending results... 
12180 1727204084.86104: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 12180 1727204084.86278: in run() - task 0affcd87-79f5-ccb1-55ae-000000000081 12180 1727204084.86418: variable 'ansible_search_path' from source: unknown 12180 1727204084.86514: variable 'ansible_search_path' from source: unknown 12180 1727204084.86560: calling self._execute() 12180 1727204084.86756: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204084.86861: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204084.86887: variable 'omit' from source: magic vars 12180 1727204084.87716: variable 'ansible_distribution_major_version' from source: facts 12180 1727204084.87786: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204084.88075: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12180 1727204084.91960: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12180 1727204084.92059: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12180 1727204084.92106: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12180 1727204084.92151: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12180 1727204084.92184: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12180 1727204084.92274: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12180 1727204084.92309: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12180 1727204084.92347: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12180 1727204084.92395: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12180 1727204084.92417: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12180 1727204084.92524: variable 'ansible_distribution_major_version' from source: facts 12180 1727204084.92544: Evaluated conditional (ansible_distribution_major_version | int > 9): False 12180 1727204084.92550: when evaluation is False, skipping this task 12180 1727204084.92557: _execute() done 12180 1727204084.92569: dumping result to json 12180 1727204084.92575: done dumping result, returning 12180 1727204084.92585: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcd87-79f5-ccb1-55ae-000000000081] 12180 1727204084.92593: sending task result for task 0affcd87-79f5-ccb1-55ae-000000000081 skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int > 9", "skip_reason": "Conditional result was False" } 12180 1727204084.92735: no more pending results, returning what we have 12180 1727204084.92739: results queue empty 12180 1727204084.92740: checking for any_errors_fatal 12180 1727204084.92745: done checking for any_errors_fatal 12180 
1727204084.92746: checking for max_fail_percentage 12180 1727204084.92748: done checking for max_fail_percentage 12180 1727204084.92749: checking to see if all hosts have failed and the running result is not ok 12180 1727204084.92750: done checking to see if all hosts have failed 12180 1727204084.92751: getting the remaining hosts for this loop 12180 1727204084.92752: done getting the remaining hosts for this loop 12180 1727204084.92756: getting the next task for host managed-node1 12180 1727204084.92771: done getting next task for host managed-node1 12180 1727204084.92776: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 12180 1727204084.92780: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 12180 1727204084.92799: getting variables 12180 1727204084.92802: in VariableManager get_vars() 12180 1727204084.92847: Calling all_inventory to load vars for managed-node1 12180 1727204084.92851: Calling groups_inventory to load vars for managed-node1 12180 1727204084.92853: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204084.92867: Calling all_plugins_play to load vars for managed-node1 12180 1727204084.92870: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204084.92874: Calling groups_plugins_play to load vars for managed-node1 12180 1727204084.93887: done sending task result for task 0affcd87-79f5-ccb1-55ae-000000000081 12180 1727204084.93891: WORKER PROCESS EXITING 12180 1727204084.95374: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204084.97383: done with get_vars() 12180 1727204084.97410: done getting variables 12180 1727204084.97473: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 14:54:44 -0400 (0:00:00.141) 0:00:32.387 ***** 12180 1727204084.97507: entering _queue_task() for managed-node1/dnf 12180 1727204084.97847: worker is 1 (out of 1 available) 12180 1727204084.97859: exiting _queue_task() for managed-node1/dnf 12180 1727204084.97872: done queuing things up, now waiting for results queue to drain 12180 1727204084.97874: waiting for pending results... 
12180 1727204084.98169: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 12180 1727204084.98334: in run() - task 0affcd87-79f5-ccb1-55ae-000000000082 12180 1727204084.98353: variable 'ansible_search_path' from source: unknown 12180 1727204084.98361: variable 'ansible_search_path' from source: unknown 12180 1727204084.98402: calling self._execute() 12180 1727204084.98505: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204084.98516: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204084.98536: variable 'omit' from source: magic vars 12180 1727204084.99040: variable 'ansible_distribution_major_version' from source: facts 12180 1727204084.99057: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204084.99347: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12180 1727204085.02270: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12180 1727204085.02359: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12180 1727204085.02404: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12180 1727204085.02452: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12180 1727204085.02486: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12180 1727204085.02578: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12180 1727204085.02611: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12180 1727204085.02646: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12180 1727204085.02697: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12180 1727204085.02718: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12180 1727204085.02855: variable 'ansible_distribution' from source: facts 12180 1727204085.02866: variable 'ansible_distribution_major_version' from source: facts 12180 1727204085.02888: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 12180 1727204085.03020: variable '__network_wireless_connections_defined' from source: role '' defaults 12180 1727204085.03162: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12180 1727204085.03197: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12180 1727204085.03233: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12180 1727204085.03279: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12180 1727204085.03299: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12180 1727204085.03352: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12180 1727204085.03452: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12180 1727204085.03485: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12180 1727204085.03578: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12180 1727204085.03661: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12180 1727204085.03708: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12180 1727204085.03887: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12180 
1727204085.03916: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12180 1727204085.03964: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12180 1727204085.03988: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12180 1727204085.04369: variable 'network_connections' from source: task vars 12180 1727204085.04384: variable 'port2_profile' from source: play vars 12180 1727204085.04477: variable 'port2_profile' from source: play vars 12180 1727204085.04525: variable 'port1_profile' from source: play vars 12180 1727204085.04629: variable 'port1_profile' from source: play vars 12180 1727204085.04635: variable 'controller_profile' from source: play vars 12180 1727204085.04714: variable 'controller_profile' from source: play vars 12180 1727204085.04788: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12180 1727204085.04999: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12180 1727204085.05062: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12180 1727204085.05086: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12180 1727204085.05105: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12180 1727204085.05143: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 
(found_in_cache=True, class_only=False) 12180 1727204085.05162: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12180 1727204085.05192: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12180 1727204085.05212: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12180 1727204085.05250: variable '__network_team_connections_defined' from source: role '' defaults 12180 1727204085.05412: variable 'network_connections' from source: task vars 12180 1727204085.05415: variable 'port2_profile' from source: play vars 12180 1727204085.05461: variable 'port2_profile' from source: play vars 12180 1727204085.05470: variable 'port1_profile' from source: play vars 12180 1727204085.05514: variable 'port1_profile' from source: play vars 12180 1727204085.05517: variable 'controller_profile' from source: play vars 12180 1727204085.05563: variable 'controller_profile' from source: play vars 12180 1727204085.05586: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 12180 1727204085.05590: when evaluation is False, skipping this task 12180 1727204085.05592: _execute() done 12180 1727204085.05595: dumping result to json 12180 1727204085.05597: done dumping result, returning 12180 1727204085.05603: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcd87-79f5-ccb1-55ae-000000000082] 12180 1727204085.05609: sending task result for task 
0affcd87-79f5-ccb1-55ae-000000000082 12180 1727204085.05706: done sending task result for task 0affcd87-79f5-ccb1-55ae-000000000082 12180 1727204085.05708: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 12180 1727204085.05775: no more pending results, returning what we have 12180 1727204085.05779: results queue empty 12180 1727204085.05780: checking for any_errors_fatal 12180 1727204085.05787: done checking for any_errors_fatal 12180 1727204085.05787: checking for max_fail_percentage 12180 1727204085.05789: done checking for max_fail_percentage 12180 1727204085.05790: checking to see if all hosts have failed and the running result is not ok 12180 1727204085.05791: done checking to see if all hosts have failed 12180 1727204085.05791: getting the remaining hosts for this loop 12180 1727204085.05793: done getting the remaining hosts for this loop 12180 1727204085.05797: getting the next task for host managed-node1 12180 1727204085.05803: done getting next task for host managed-node1 12180 1727204085.05808: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 12180 1727204085.05811: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 12180 1727204085.05832: getting variables 12180 1727204085.05834: in VariableManager get_vars() 12180 1727204085.05873: Calling all_inventory to load vars for managed-node1 12180 1727204085.05876: Calling groups_inventory to load vars for managed-node1 12180 1727204085.05878: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204085.05887: Calling all_plugins_play to load vars for managed-node1 12180 1727204085.05889: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204085.05892: Calling groups_plugins_play to load vars for managed-node1 12180 1727204085.07927: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204085.09602: done with get_vars() 12180 1727204085.09626: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 12180 1727204085.09709: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 14:54:45 -0400 (0:00:00.122) 0:00:32.509 ***** 12180 1727204085.09748: entering _queue_task() for managed-node1/yum 12180 1727204085.10076: worker is 1 (out of 1 available) 12180 
1727204085.10090: exiting _queue_task() for managed-node1/yum 12180 1727204085.10103: done queuing things up, now waiting for results queue to drain 12180 1727204085.10105: waiting for pending results... 12180 1727204085.10413: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 12180 1727204085.10588: in run() - task 0affcd87-79f5-ccb1-55ae-000000000083 12180 1727204085.10607: variable 'ansible_search_path' from source: unknown 12180 1727204085.10616: variable 'ansible_search_path' from source: unknown 12180 1727204085.10935: calling self._execute() 12180 1727204085.11046: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204085.11058: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204085.11076: variable 'omit' from source: magic vars 12180 1727204085.11496: variable 'ansible_distribution_major_version' from source: facts 12180 1727204085.11516: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204085.11710: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12180 1727204085.14379: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12180 1727204085.14478: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12180 1727204085.14534: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12180 1727204085.14581: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12180 1727204085.14614: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12180 1727204085.14705: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12180 1727204085.14749: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12180 1727204085.14786: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12180 1727204085.14838: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12180 1727204085.14862: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12180 1727204085.14974: variable 'ansible_distribution_major_version' from source: facts 12180 1727204085.14996: Evaluated conditional (ansible_distribution_major_version | int < 8): False 12180 1727204085.15003: when evaluation is False, skipping this task 12180 1727204085.15010: _execute() done 12180 1727204085.15017: dumping result to json 12180 1727204085.15024: done dumping result, returning 12180 1727204085.15039: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcd87-79f5-ccb1-55ae-000000000083] 12180 1727204085.15050: sending task result for task 0affcd87-79f5-ccb1-55ae-000000000083 skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was 
False" } 12180 1727204085.15227: no more pending results, returning what we have 12180 1727204085.15234: results queue empty 12180 1727204085.15236: checking for any_errors_fatal 12180 1727204085.15244: done checking for any_errors_fatal 12180 1727204085.15245: checking for max_fail_percentage 12180 1727204085.15246: done checking for max_fail_percentage 12180 1727204085.15247: checking to see if all hosts have failed and the running result is not ok 12180 1727204085.15248: done checking to see if all hosts have failed 12180 1727204085.15249: getting the remaining hosts for this loop 12180 1727204085.15250: done getting the remaining hosts for this loop 12180 1727204085.15254: getting the next task for host managed-node1 12180 1727204085.15261: done getting next task for host managed-node1 12180 1727204085.15268: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 12180 1727204085.15272: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 12180 1727204085.15292: getting variables 12180 1727204085.15294: in VariableManager get_vars() 12180 1727204085.15344: Calling all_inventory to load vars for managed-node1 12180 1727204085.15347: Calling groups_inventory to load vars for managed-node1 12180 1727204085.15350: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204085.15362: Calling all_plugins_play to load vars for managed-node1 12180 1727204085.15367: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204085.15370: Calling groups_plugins_play to load vars for managed-node1 12180 1727204085.16486: done sending task result for task 0affcd87-79f5-ccb1-55ae-000000000083 12180 1727204085.16489: WORKER PROCESS EXITING 12180 1727204085.17158: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204085.18903: done with get_vars() 12180 1727204085.18938: done getting variables 12180 1727204085.19005: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 14:54:45 -0400 (0:00:00.092) 0:00:32.602 ***** 12180 1727204085.19046: entering _queue_task() for managed-node1/fail 12180 1727204085.19391: worker is 1 (out of 1 available) 12180 1727204085.19404: exiting _queue_task() for managed-node1/fail 12180 1727204085.19416: done queuing things up, now waiting for results queue to drain 12180 1727204085.19418: waiting for pending results... 
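[Editor's note: the skipped-task payloads above all share one shape. A minimal sketch — not Ansible's actual implementation — of building the result dict that this log prints under `skipping: [managed-node1]`:]

```python
# Minimal sketch (not Ansible's internal code) of the skipped-task result
# payload that appears repeatedly in this log for conditionally skipped tasks.
def skip_result(false_condition):
    """Build the dict printed under 'skipping: [managed-node1] =>'."""
    return {
        "changed": False,
        "false_condition": false_condition,
        "skip_reason": "Conditional result was False",
    }

result = skip_result(
    "__network_wireless_connections_defined or __network_team_connections_defined"
)
print(result["skip_reason"])  # → Conditional result was False
```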
12180 1727204085.19734: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 12180 1727204085.19893: in run() - task 0affcd87-79f5-ccb1-55ae-000000000084 12180 1727204085.19911: variable 'ansible_search_path' from source: unknown 12180 1727204085.19919: variable 'ansible_search_path' from source: unknown 12180 1727204085.19963: calling self._execute() 12180 1727204085.20069: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204085.20085: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204085.20100: variable 'omit' from source: magic vars 12180 1727204085.20486: variable 'ansible_distribution_major_version' from source: facts 12180 1727204085.20505: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204085.20638: variable '__network_wireless_connections_defined' from source: role '' defaults 12180 1727204085.20853: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12180 1727204085.23284: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12180 1727204085.23699: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12180 1727204085.23744: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12180 1727204085.23788: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12180 1727204085.23819: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12180 1727204085.23905: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 12180 1727204085.23938: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12180 1727204085.23971: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12180 1727204085.24023: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12180 1727204085.24045: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12180 1727204085.24099: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12180 1727204085.24127: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12180 1727204085.24158: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12180 1727204085.24208: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12180 1727204085.24228: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12180 1727204085.24278: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12180 1727204085.24308: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12180 1727204085.24342: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12180 1727204085.24387: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12180 1727204085.24406: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12180 1727204085.24599: variable 'network_connections' from source: task vars 12180 1727204085.24615: variable 'port2_profile' from source: play vars 12180 1727204085.24695: variable 'port2_profile' from source: play vars 12180 1727204085.24710: variable 'port1_profile' from source: play vars 12180 1727204085.24783: variable 'port1_profile' from source: play vars 12180 1727204085.24797: variable 'controller_profile' from source: play vars 12180 1727204085.24867: variable 'controller_profile' from source: play vars 12180 1727204085.24944: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12180 1727204085.25125: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12180 1727204085.25173: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12180 1727204085.25209: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12180 1727204085.25258: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12180 1727204085.25311: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12180 1727204085.25340: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12180 1727204085.25372: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12180 1727204085.25407: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12180 1727204085.25463: variable '__network_team_connections_defined' from source: role '' defaults 12180 1727204085.25723: variable 'network_connections' from source: task vars 12180 1727204085.25737: variable 'port2_profile' from source: play vars 12180 1727204085.25802: variable 'port2_profile' from source: play vars 12180 1727204085.25817: variable 'port1_profile' from source: play vars 12180 1727204085.25891: variable 'port1_profile' from source: play vars 12180 1727204085.25902: variable 'controller_profile' from source: play vars 12180 1727204085.25987: variable 'controller_profile' from source: play vars 12180 1727204085.26005: Evaluated conditional 
(__network_wireless_connections_defined or __network_team_connections_defined): False 12180 1727204085.26017: when evaluation is False, skipping this task 12180 1727204085.26020: _execute() done 12180 1727204085.26022: dumping result to json 12180 1727204085.26025: done dumping result, returning 12180 1727204085.26027: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-ccb1-55ae-000000000084] 12180 1727204085.26029: sending task result for task 0affcd87-79f5-ccb1-55ae-000000000084 12180 1727204085.26136: done sending task result for task 0affcd87-79f5-ccb1-55ae-000000000084 12180 1727204085.26139: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 12180 1727204085.26200: no more pending results, returning what we have 12180 1727204085.26204: results queue empty 12180 1727204085.26205: checking for any_errors_fatal 12180 1727204085.26211: done checking for any_errors_fatal 12180 1727204085.26212: checking for max_fail_percentage 12180 1727204085.26214: done checking for max_fail_percentage 12180 1727204085.26214: checking to see if all hosts have failed and the running result is not ok 12180 1727204085.26215: done checking to see if all hosts have failed 12180 1727204085.26216: getting the remaining hosts for this loop 12180 1727204085.26217: done getting the remaining hosts for this loop 12180 1727204085.26220: getting the next task for host managed-node1 12180 1727204085.26228: done getting next task for host managed-node1 12180 1727204085.26232: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 12180 1727204085.26235: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 12180 1727204085.26255: getting variables 12180 1727204085.26257: in VariableManager get_vars() 12180 1727204085.26304: Calling all_inventory to load vars for managed-node1 12180 1727204085.26307: Calling groups_inventory to load vars for managed-node1 12180 1727204085.26309: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204085.26318: Calling all_plugins_play to load vars for managed-node1 12180 1727204085.26319: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204085.26322: Calling groups_plugins_play to load vars for managed-node1 12180 1727204085.27539: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204085.29690: done with get_vars() 12180 1727204085.29722: done getting variables 12180 1727204085.29795: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: 
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 14:54:45 -0400 (0:00:00.107) 0:00:32.710 ***** 12180 1727204085.29835: entering _queue_task() for managed-node1/package 12180 1727204085.30217: worker is 1 (out of 1 available) 12180 1727204085.30250: exiting _queue_task() for managed-node1/package 12180 1727204085.30266: done queuing things up, now waiting for results queue to drain 12180 1727204085.30268: waiting for pending results... 12180 1727204085.30482: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install packages 12180 1727204085.30582: in run() - task 0affcd87-79f5-ccb1-55ae-000000000085 12180 1727204085.30592: variable 'ansible_search_path' from source: unknown 12180 1727204085.30596: variable 'ansible_search_path' from source: unknown 12180 1727204085.30625: calling self._execute() 12180 1727204085.30706: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204085.30709: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204085.30717: variable 'omit' from source: magic vars 12180 1727204085.31478: variable 'ansible_distribution_major_version' from source: facts 12180 1727204085.31481: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204085.31484: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12180 1727204085.31487: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12180 1727204085.31771: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12180 1727204085.31776: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12180 1727204085.31778: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12180 
1727204085.31780: variable 'network_packages' from source: role '' defaults 12180 1727204085.31835: variable '__network_provider_setup' from source: role '' defaults 12180 1727204085.31842: variable '__network_service_name_default_nm' from source: role '' defaults 12180 1727204085.31902: variable '__network_service_name_default_nm' from source: role '' defaults 12180 1727204085.31950: variable '__network_packages_default_nm' from source: role '' defaults 12180 1727204085.32193: variable '__network_packages_default_nm' from source: role '' defaults 12180 1727204085.32372: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12180 1727204085.35520: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12180 1727204085.35588: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12180 1727204085.35619: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12180 1727204085.35648: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12180 1727204085.35674: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12180 1727204085.35762: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12180 1727204085.35789: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12180 1727204085.35811: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 12180 1727204085.35850: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12180 1727204085.35867: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12180 1727204085.35915: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12180 1727204085.35937: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12180 1727204085.35963: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12180 1727204085.36004: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12180 1727204085.36018: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12180 1727204085.36249: variable '__network_packages_default_gobject_packages' from source: role '' defaults 12180 1727204085.36368: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12180 1727204085.36392: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12180 1727204085.36416: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12180 1727204085.36456: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12180 1727204085.36471: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12180 1727204085.36567: variable 'ansible_python' from source: facts 12180 1727204085.36592: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 12180 1727204085.36679: variable '__network_wpa_supplicant_required' from source: role '' defaults 12180 1727204085.36758: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 12180 1727204085.36890: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12180 1727204085.36910: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12180 1727204085.36935: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12180 1727204085.36974: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12180 1727204085.36988: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12180 1727204085.37036: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12180 1727204085.37058: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12180 1727204085.37084: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12180 1727204085.37123: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12180 1727204085.37137: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12180 1727204085.37286: variable 'network_connections' from source: task vars 12180 1727204085.37292: variable 'port2_profile' from source: play vars 12180 1727204085.37395: variable 'port2_profile' from source: play vars 12180 1727204085.37406: variable 'port1_profile' from source: play vars 12180 1727204085.37508: variable 'port1_profile' from source: play vars 12180 1727204085.37516: variable 'controller_profile' from source: play vars 12180 1727204085.37969: 
variable 'controller_profile' from source: play vars 12180 1727204085.37973: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12180 1727204085.37976: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12180 1727204085.37979: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12180 1727204085.37981: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12180 1727204085.37983: variable '__network_wireless_connections_defined' from source: role '' defaults 12180 1727204085.38113: variable 'network_connections' from source: task vars 12180 1727204085.38117: variable 'port2_profile' from source: play vars 12180 1727204085.38225: variable 'port2_profile' from source: play vars 12180 1727204085.38235: variable 'port1_profile' from source: play vars 12180 1727204085.38336: variable 'port1_profile' from source: play vars 12180 1727204085.38346: variable 'controller_profile' from source: play vars 12180 1727204085.38441: variable 'controller_profile' from source: play vars 12180 1727204085.38476: variable '__network_packages_default_wireless' from source: role '' defaults 12180 1727204085.38554: variable '__network_wireless_connections_defined' from source: role '' defaults 12180 1727204085.38838: variable 'network_connections' from source: task vars 12180 1727204085.38842: variable 'port2_profile' from source: play vars 12180 1727204085.38910: variable 'port2_profile' from source: play vars 12180 1727204085.38916: variable 
'port1_profile' from source: play vars 12180 1727204085.38980: variable 'port1_profile' from source: play vars 12180 1727204085.38986: variable 'controller_profile' from source: play vars 12180 1727204085.39043: variable 'controller_profile' from source: play vars 12180 1727204085.39068: variable '__network_packages_default_team' from source: role '' defaults 12180 1727204085.39145: variable '__network_team_connections_defined' from source: role '' defaults 12180 1727204085.39457: variable 'network_connections' from source: task vars 12180 1727204085.39461: variable 'port2_profile' from source: play vars 12180 1727204085.39525: variable 'port2_profile' from source: play vars 12180 1727204085.39535: variable 'port1_profile' from source: play vars 12180 1727204085.39599: variable 'port1_profile' from source: play vars 12180 1727204085.39607: variable 'controller_profile' from source: play vars 12180 1727204085.39675: variable 'controller_profile' from source: play vars 12180 1727204085.39737: variable '__network_service_name_default_initscripts' from source: role '' defaults 12180 1727204085.39798: variable '__network_service_name_default_initscripts' from source: role '' defaults 12180 1727204085.39804: variable '__network_packages_default_initscripts' from source: role '' defaults 12180 1727204085.39862: variable '__network_packages_default_initscripts' from source: role '' defaults 12180 1727204085.40081: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 12180 1727204085.40604: variable 'network_connections' from source: task vars 12180 1727204085.40607: variable 'port2_profile' from source: play vars 12180 1727204085.40670: variable 'port2_profile' from source: play vars 12180 1727204085.40678: variable 'port1_profile' from source: play vars 12180 1727204085.40738: variable 'port1_profile' from source: play vars 12180 1727204085.40745: variable 'controller_profile' from source: play vars 12180 1727204085.40811: variable 
'controller_profile' from source: play vars 12180 1727204085.40819: variable 'ansible_distribution' from source: facts 12180 1727204085.40822: variable '__network_rh_distros' from source: role '' defaults 12180 1727204085.40828: variable 'ansible_distribution_major_version' from source: facts 12180 1727204085.40844: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 12180 1727204085.41014: variable 'ansible_distribution' from source: facts 12180 1727204085.41017: variable '__network_rh_distros' from source: role '' defaults 12180 1727204085.41022: variable 'ansible_distribution_major_version' from source: facts 12180 1727204085.41035: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 12180 1727204085.41179: variable 'ansible_distribution' from source: facts 12180 1727204085.41183: variable '__network_rh_distros' from source: role '' defaults 12180 1727204085.41188: variable 'ansible_distribution_major_version' from source: facts 12180 1727204085.41223: variable 'network_provider' from source: set_fact 12180 1727204085.41237: variable 'ansible_facts' from source: unknown 12180 1727204085.41932: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 12180 1727204085.41936: when evaluation is False, skipping this task 12180 1727204085.41939: _execute() done 12180 1727204085.41941: dumping result to json 12180 1727204085.41943: done dumping result, returning 12180 1727204085.41950: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install packages [0affcd87-79f5-ccb1-55ae-000000000085] 12180 1727204085.41956: sending task result for task 0affcd87-79f5-ccb1-55ae-000000000085 12180 1727204085.42065: done sending task result for task 0affcd87-79f5-ccb1-55ae-000000000085 12180 1727204085.42068: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "not network_packages is 
subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 12180 1727204085.42115: no more pending results, returning what we have 12180 1727204085.42119: results queue empty 12180 1727204085.42120: checking for any_errors_fatal 12180 1727204085.42126: done checking for any_errors_fatal 12180 1727204085.42127: checking for max_fail_percentage 12180 1727204085.42129: done checking for max_fail_percentage 12180 1727204085.42131: checking to see if all hosts have failed and the running result is not ok 12180 1727204085.42132: done checking to see if all hosts have failed 12180 1727204085.42133: getting the remaining hosts for this loop 12180 1727204085.42134: done getting the remaining hosts for this loop 12180 1727204085.42143: getting the next task for host managed-node1 12180 1727204085.42151: done getting next task for host managed-node1 12180 1727204085.42155: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 12180 1727204085.42158: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 12180 1727204085.42180: getting variables 12180 1727204085.42181: in VariableManager get_vars() 12180 1727204085.42221: Calling all_inventory to load vars for managed-node1 12180 1727204085.42224: Calling groups_inventory to load vars for managed-node1 12180 1727204085.42226: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204085.42237: Calling all_plugins_play to load vars for managed-node1 12180 1727204085.42239: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204085.42242: Calling groups_plugins_play to load vars for managed-node1 12180 1727204085.43648: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204085.45488: done with get_vars() 12180 1727204085.45518: done getting variables 12180 1727204085.45589: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 14:54:45 -0400 (0:00:00.157) 0:00:32.868 ***** 12180 1727204085.45628: entering _queue_task() for managed-node1/package 12180 1727204085.45974: worker is 1 (out of 1 available) 12180 1727204085.45987: exiting _queue_task() for managed-node1/package 12180 1727204085.45999: done queuing things up, now waiting for results queue to drain 12180 1727204085.46000: waiting for pending results... 
12180 1727204085.46306: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 12180 1727204085.46477: in run() - task 0affcd87-79f5-ccb1-55ae-000000000086 12180 1727204085.46499: variable 'ansible_search_path' from source: unknown 12180 1727204085.46507: variable 'ansible_search_path' from source: unknown 12180 1727204085.46557: calling self._execute() 12180 1727204085.46660: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204085.46677: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204085.46689: variable 'omit' from source: magic vars 12180 1727204085.47119: variable 'ansible_distribution_major_version' from source: facts 12180 1727204085.47142: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204085.47318: variable 'network_state' from source: role '' defaults 12180 1727204085.47339: Evaluated conditional (network_state != {}): False 12180 1727204085.47377: when evaluation is False, skipping this task 12180 1727204085.47385: _execute() done 12180 1727204085.47392: dumping result to json 12180 1727204085.47398: done dumping result, returning 12180 1727204085.47420: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcd87-79f5-ccb1-55ae-000000000086] 12180 1727204085.47439: sending task result for task 0affcd87-79f5-ccb1-55ae-000000000086 skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 12180 1727204085.47632: no more pending results, returning what we have 12180 1727204085.47637: results queue empty 12180 1727204085.47639: checking for any_errors_fatal 12180 1727204085.47645: done checking for any_errors_fatal 12180 1727204085.47646: checking for max_fail_percentage 12180 
1727204085.47648: done checking for max_fail_percentage 12180 1727204085.47649: checking to see if all hosts have failed and the running result is not ok 12180 1727204085.47650: done checking to see if all hosts have failed 12180 1727204085.47651: getting the remaining hosts for this loop 12180 1727204085.47652: done getting the remaining hosts for this loop 12180 1727204085.47657: getting the next task for host managed-node1 12180 1727204085.47668: done getting next task for host managed-node1 12180 1727204085.47673: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 12180 1727204085.47678: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 12180 1727204085.47702: getting variables 12180 1727204085.47704: in VariableManager get_vars() 12180 1727204085.47755: Calling all_inventory to load vars for managed-node1 12180 1727204085.47758: Calling groups_inventory to load vars for managed-node1 12180 1727204085.47761: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204085.47776: Calling all_plugins_play to load vars for managed-node1 12180 1727204085.47779: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204085.47782: Calling groups_plugins_play to load vars for managed-node1 12180 1727204085.48950: done sending task result for task 0affcd87-79f5-ccb1-55ae-000000000086 12180 1727204085.48954: WORKER PROCESS EXITING 12180 1727204085.50055: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204085.51593: done with get_vars() 12180 1727204085.51618: done getting variables 12180 1727204085.51670: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 14:54:45 -0400 (0:00:00.060) 0:00:32.928 ***** 12180 1727204085.51697: entering _queue_task() for managed-node1/package 12180 1727204085.51944: worker is 1 (out of 1 available) 12180 1727204085.51959: exiting _queue_task() for managed-node1/package 12180 1727204085.51972: done queuing things up, now waiting for results queue to drain 12180 1727204085.51974: waiting for pending results... 
12180 1727204085.52162: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 12180 1727204085.52267: in run() - task 0affcd87-79f5-ccb1-55ae-000000000087 12180 1727204085.52279: variable 'ansible_search_path' from source: unknown 12180 1727204085.52285: variable 'ansible_search_path' from source: unknown 12180 1727204085.52314: calling self._execute() 12180 1727204085.52388: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204085.52391: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204085.52401: variable 'omit' from source: magic vars 12180 1727204085.52712: variable 'ansible_distribution_major_version' from source: facts 12180 1727204085.52716: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204085.53333: variable 'network_state' from source: role '' defaults 12180 1727204085.53337: Evaluated conditional (network_state != {}): False 12180 1727204085.53340: when evaluation is False, skipping this task 12180 1727204085.53343: _execute() done 12180 1727204085.53345: dumping result to json 12180 1727204085.53347: done dumping result, returning 12180 1727204085.53350: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcd87-79f5-ccb1-55ae-000000000087] 12180 1727204085.53352: sending task result for task 0affcd87-79f5-ccb1-55ae-000000000087 12180 1727204085.53425: done sending task result for task 0affcd87-79f5-ccb1-55ae-000000000087 12180 1727204085.53428: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 12180 1727204085.53469: no more pending results, returning what we have 12180 1727204085.53472: results queue empty 12180 1727204085.53473: checking for 
any_errors_fatal 12180 1727204085.53477: done checking for any_errors_fatal 12180 1727204085.53478: checking for max_fail_percentage 12180 1727204085.53480: done checking for max_fail_percentage 12180 1727204085.53481: checking to see if all hosts have failed and the running result is not ok 12180 1727204085.53482: done checking to see if all hosts have failed 12180 1727204085.53483: getting the remaining hosts for this loop 12180 1727204085.53484: done getting the remaining hosts for this loop 12180 1727204085.53487: getting the next task for host managed-node1 12180 1727204085.53493: done getting next task for host managed-node1 12180 1727204085.53497: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 12180 1727204085.53502: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 12180 1727204085.53517: getting variables 12180 1727204085.53519: in VariableManager get_vars() 12180 1727204085.53559: Calling all_inventory to load vars for managed-node1 12180 1727204085.53562: Calling groups_inventory to load vars for managed-node1 12180 1727204085.53566: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204085.53574: Calling all_plugins_play to load vars for managed-node1 12180 1727204085.53576: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204085.53579: Calling groups_plugins_play to load vars for managed-node1 12180 1727204085.55630: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204085.57517: done with get_vars() 12180 1727204085.57542: done getting variables 12180 1727204085.57610: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 14:54:45 -0400 (0:00:00.059) 0:00:32.988 ***** 12180 1727204085.57651: entering _queue_task() for managed-node1/service 12180 1727204085.58083: worker is 1 (out of 1 available) 12180 1727204085.58095: exiting _queue_task() for managed-node1/service 12180 1727204085.58107: done queuing things up, now waiting for results queue to drain 12180 1727204085.58108: waiting for pending results... 
12180 1727204085.58405: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 12180 1727204085.58575: in run() - task 0affcd87-79f5-ccb1-55ae-000000000088 12180 1727204085.58595: variable 'ansible_search_path' from source: unknown 12180 1727204085.58603: variable 'ansible_search_path' from source: unknown 12180 1727204085.58643: calling self._execute() 12180 1727204085.58741: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204085.58788: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204085.58801: variable 'omit' from source: magic vars 12180 1727204085.59181: variable 'ansible_distribution_major_version' from source: facts 12180 1727204085.59203: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204085.59333: variable '__network_wireless_connections_defined' from source: role '' defaults 12180 1727204085.59541: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12180 1727204085.62696: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12180 1727204085.62791: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12180 1727204085.62841: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12180 1727204085.62891: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12180 1727204085.62925: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12180 1727204085.63019: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 12180 1727204085.63054: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12180 1727204085.63090: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12180 1727204085.63139: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12180 1727204085.63160: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12180 1727204085.63219: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12180 1727204085.63249: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12180 1727204085.63283: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12180 1727204085.63332: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12180 1727204085.63352: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12180 1727204085.63400: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12180 1727204085.63435: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12180 1727204085.63468: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12180 1727204085.63513: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12180 1727204085.63538: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12180 1727204085.63737: variable 'network_connections' from source: task vars 12180 1727204085.63761: variable 'port2_profile' from source: play vars 12180 1727204085.63837: variable 'port2_profile' from source: play vars 12180 1727204085.63855: variable 'port1_profile' from source: play vars 12180 1727204085.63925: variable 'port1_profile' from source: play vars 12180 1727204085.63939: variable 'controller_profile' from source: play vars 12180 1727204085.64012: variable 'controller_profile' from source: play vars 12180 1727204085.64098: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12180 1727204085.64293: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12180 1727204085.64339: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12180 1727204085.64379: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12180 1727204085.64419: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12180 1727204085.64468: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12180 1727204085.64494: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12180 1727204085.64527: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12180 1727204085.64560: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12180 1727204085.64626: variable '__network_team_connections_defined' from source: role '' defaults 12180 1727204085.64881: variable 'network_connections' from source: task vars 12180 1727204085.64892: variable 'port2_profile' from source: play vars 12180 1727204085.64961: variable 'port2_profile' from source: play vars 12180 1727204085.64978: variable 'port1_profile' from source: play vars 12180 1727204085.65146: variable 'port1_profile' from source: play vars 12180 1727204085.65157: variable 'controller_profile' from source: play vars 12180 1727204085.65349: variable 'controller_profile' from source: play vars 12180 1727204085.65386: Evaluated conditional 
(__network_wireless_connections_defined or __network_team_connections_defined): False 12180 1727204085.65403: when evaluation is False, skipping this task 12180 1727204085.65411: _execute() done 12180 1727204085.65418: dumping result to json 12180 1727204085.65425: done dumping result, returning 12180 1727204085.65440: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-ccb1-55ae-000000000088] 12180 1727204085.65450: sending task result for task 0affcd87-79f5-ccb1-55ae-000000000088 skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 12180 1727204085.65611: no more pending results, returning what we have 12180 1727204085.65616: results queue empty 12180 1727204085.65617: checking for any_errors_fatal 12180 1727204085.65622: done checking for any_errors_fatal 12180 1727204085.65623: checking for max_fail_percentage 12180 1727204085.65625: done checking for max_fail_percentage 12180 1727204085.65626: checking to see if all hosts have failed and the running result is not ok 12180 1727204085.65626: done checking to see if all hosts have failed 12180 1727204085.65627: getting the remaining hosts for this loop 12180 1727204085.65629: done getting the remaining hosts for this loop 12180 1727204085.65633: getting the next task for host managed-node1 12180 1727204085.65642: done getting next task for host managed-node1 12180 1727204085.65646: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 12180 1727204085.65650: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 12180 1727204085.65672: getting variables 12180 1727204085.65674: in VariableManager get_vars() 12180 1727204085.65720: Calling all_inventory to load vars for managed-node1 12180 1727204085.65723: Calling groups_inventory to load vars for managed-node1 12180 1727204085.65726: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204085.65738: Calling all_plugins_play to load vars for managed-node1 12180 1727204085.65741: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204085.65744: Calling groups_plugins_play to load vars for managed-node1 12180 1727204085.66982: done sending task result for task 0affcd87-79f5-ccb1-55ae-000000000088 12180 1727204085.66987: WORKER PROCESS EXITING 12180 1727204085.67640: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204085.70028: done with get_vars() 12180 1727204085.70058: done getting variables 12180 1727204085.70121: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: 
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 14:54:45 -0400 (0:00:00.125) 0:00:33.113 ***** 12180 1727204085.70156: entering _queue_task() for managed-node1/service 12180 1727204085.70475: worker is 1 (out of 1 available) 12180 1727204085.70488: exiting _queue_task() for managed-node1/service 12180 1727204085.70503: done queuing things up, now waiting for results queue to drain 12180 1727204085.70504: waiting for pending results... 12180 1727204085.71007: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 12180 1727204085.71173: in run() - task 0affcd87-79f5-ccb1-55ae-000000000089 12180 1727204085.71195: variable 'ansible_search_path' from source: unknown 12180 1727204085.71204: variable 'ansible_search_path' from source: unknown 12180 1727204085.71254: calling self._execute() 12180 1727204085.71402: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204085.71414: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204085.71568: variable 'omit' from source: magic vars 12180 1727204085.72325: variable 'ansible_distribution_major_version' from source: facts 12180 1727204085.72345: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204085.72633: variable 'network_provider' from source: set_fact 12180 1727204085.72775: variable 'network_state' from source: role '' defaults 12180 1727204085.72827: Evaluated conditional (network_provider == "nm" or network_state != {}): True 12180 1727204085.72940: variable 'omit' from source: magic vars 12180 1727204085.73018: variable 'omit' from source: magic vars 12180 1727204085.73091: variable 'network_service_name' from source: role '' defaults 12180 1727204085.73169: variable 'network_service_name' from source: role '' defaults 12180 1727204085.73302: variable '__network_provider_setup' from 
source: role '' defaults 12180 1727204085.73313: variable '__network_service_name_default_nm' from source: role '' defaults 12180 1727204085.73379: variable '__network_service_name_default_nm' from source: role '' defaults 12180 1727204085.73394: variable '__network_packages_default_nm' from source: role '' defaults 12180 1727204085.73459: variable '__network_packages_default_nm' from source: role '' defaults 12180 1727204085.73700: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12180 1727204085.76029: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12180 1727204085.76116: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12180 1727204085.76161: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12180 1727204085.76202: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12180 1727204085.76237: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12180 1727204085.76320: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12180 1727204085.76359: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12180 1727204085.76392: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12180 1727204085.76442: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12180 1727204085.76461: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12180 1727204085.76511: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12180 1727204085.76538: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12180 1727204085.76572: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12180 1727204085.76615: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12180 1727204085.76633: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12180 1727204085.76886: variable '__network_packages_default_gobject_packages' from source: role '' defaults 12180 1727204085.77011: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12180 1727204085.77040: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 12180 1727204085.77070: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12180 1727204085.77118: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12180 1727204085.77136: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12180 1727204085.77241: variable 'ansible_python' from source: facts 12180 1727204085.77269: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 12180 1727204085.77360: variable '__network_wpa_supplicant_required' from source: role '' defaults 12180 1727204085.77452: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 12180 1727204085.77591: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12180 1727204085.77620: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12180 1727204085.77653: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12180 1727204085.77698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12180 
1727204085.77715: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12180 1727204085.77771: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12180 1727204085.77809: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12180 1727204085.77837: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12180 1727204085.77887: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12180 1727204085.77904: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12180 1727204085.78051: variable 'network_connections' from source: task vars 12180 1727204085.78071: variable 'port2_profile' from source: play vars 12180 1727204085.78149: variable 'port2_profile' from source: play vars 12180 1727204085.78174: variable 'port1_profile' from source: play vars 12180 1727204085.78249: variable 'port1_profile' from source: play vars 12180 1727204085.78269: variable 'controller_profile' from source: play vars 12180 1727204085.78348: variable 'controller_profile' from source: play vars 12180 1727204085.78470: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12180 
1727204085.79031: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12180 1727204085.79097: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12180 1727204085.79148: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12180 1727204085.79199: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12180 1727204085.79279: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12180 1727204085.79315: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12180 1727204085.79353: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12180 1727204085.79397: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12180 1727204085.79453: variable '__network_wireless_connections_defined' from source: role '' defaults 12180 1727204085.79780: variable 'network_connections' from source: task vars 12180 1727204085.79792: variable 'port2_profile' from source: play vars 12180 1727204085.79881: variable 'port2_profile' from source: play vars 12180 1727204085.79899: variable 'port1_profile' from source: play vars 12180 1727204085.79983: variable 'port1_profile' from source: play vars 12180 1727204085.80000: variable 'controller_profile' from source: play vars 12180 1727204085.80084: variable 'controller_profile' from source: play vars 12180 
1727204085.80124: variable '__network_packages_default_wireless' from source: role '' defaults 12180 1727204085.80214: variable '__network_wireless_connections_defined' from source: role '' defaults 12180 1727204085.80537: variable 'network_connections' from source: task vars 12180 1727204085.80547: variable 'port2_profile' from source: play vars 12180 1727204085.80628: variable 'port2_profile' from source: play vars 12180 1727204085.80641: variable 'port1_profile' from source: play vars 12180 1727204085.80718: variable 'port1_profile' from source: play vars 12180 1727204085.80731: variable 'controller_profile' from source: play vars 12180 1727204085.80811: variable 'controller_profile' from source: play vars 12180 1727204085.80841: variable '__network_packages_default_team' from source: role '' defaults 12180 1727204085.80929: variable '__network_team_connections_defined' from source: role '' defaults 12180 1727204085.81245: variable 'network_connections' from source: task vars 12180 1727204085.81256: variable 'port2_profile' from source: play vars 12180 1727204085.81335: variable 'port2_profile' from source: play vars 12180 1727204085.81348: variable 'port1_profile' from source: play vars 12180 1727204085.81421: variable 'port1_profile' from source: play vars 12180 1727204085.81438: variable 'controller_profile' from source: play vars 12180 1727204085.81513: variable 'controller_profile' from source: play vars 12180 1727204085.81577: variable '__network_service_name_default_initscripts' from source: role '' defaults 12180 1727204085.81639: variable '__network_service_name_default_initscripts' from source: role '' defaults 12180 1727204085.81655: variable '__network_packages_default_initscripts' from source: role '' defaults 12180 1727204085.81718: variable '__network_packages_default_initscripts' from source: role '' defaults 12180 1727204085.81942: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 12180 1727204085.82480: 
variable 'network_connections' from source: task vars 12180 1727204085.82489: variable 'port2_profile' from source: play vars 12180 1727204085.82555: variable 'port2_profile' from source: play vars 12180 1727204085.82570: variable 'port1_profile' from source: play vars 12180 1727204085.82636: variable 'port1_profile' from source: play vars 12180 1727204085.82648: variable 'controller_profile' from source: play vars 12180 1727204085.82711: variable 'controller_profile' from source: play vars 12180 1727204085.82723: variable 'ansible_distribution' from source: facts 12180 1727204085.82736: variable '__network_rh_distros' from source: role '' defaults 12180 1727204085.82746: variable 'ansible_distribution_major_version' from source: facts 12180 1727204085.82771: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 12180 1727204085.82955: variable 'ansible_distribution' from source: facts 12180 1727204085.82963: variable '__network_rh_distros' from source: role '' defaults 12180 1727204085.82975: variable 'ansible_distribution_major_version' from source: facts 12180 1727204085.82991: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 12180 1727204085.83173: variable 'ansible_distribution' from source: facts 12180 1727204085.83182: variable '__network_rh_distros' from source: role '' defaults 12180 1727204085.83191: variable 'ansible_distribution_major_version' from source: facts 12180 1727204085.83232: variable 'network_provider' from source: set_fact 12180 1727204085.83258: variable 'omit' from source: magic vars 12180 1727204085.83293: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12180 1727204085.83325: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12180 1727204085.83348: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12180 
1727204085.83371: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204085.83390: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204085.83422: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12180 1727204085.83430: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204085.83437: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204085.83551: Set connection var ansible_pipelining to False 12180 1727204085.83559: Set connection var ansible_shell_type to sh 12180 1727204085.83572: Set connection var ansible_timeout to 10 12180 1727204085.83580: Set connection var ansible_connection to ssh 12180 1727204085.83589: Set connection var ansible_shell_executable to /bin/sh 12180 1727204085.83601: Set connection var ansible_module_compression to ZIP_DEFLATED 12180 1727204085.83634: variable 'ansible_shell_executable' from source: unknown 12180 1727204085.83641: variable 'ansible_connection' from source: unknown 12180 1727204085.83647: variable 'ansible_module_compression' from source: unknown 12180 1727204085.83653: variable 'ansible_shell_type' from source: unknown 12180 1727204085.83659: variable 'ansible_shell_executable' from source: unknown 12180 1727204085.83668: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204085.83676: variable 'ansible_pipelining' from source: unknown 12180 1727204085.83682: variable 'ansible_timeout' from source: unknown 12180 1727204085.83688: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204085.83796: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12180 1727204085.83816: variable 'omit' from source: magic vars 12180 1727204085.83826: starting attempt loop 12180 1727204085.83833: running the handler 12180 1727204085.83916: variable 'ansible_facts' from source: unknown 12180 1727204085.84669: _low_level_execute_command(): starting 12180 1727204085.84687: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12180 1727204085.85424: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204085.85445: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204085.85461: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204085.85486: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204085.85532: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204085.85545: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204085.85563: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204085.85584: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204085.85597: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204085.85609: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204085.85622: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204085.85636: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204085.85654: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
<<< 12180 1727204085.85672: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204085.85684: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204085.85698: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204085.85777: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204085.85801: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204085.85818: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204085.85915: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204085.87676: stdout chunk (state=3): >>>/root <<< 12180 1727204085.87912: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204085.87916: stdout chunk (state=3): >>><<< 12180 1727204085.87918: stderr chunk (state=3): >>><<< 12180 1727204085.88036: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204085.88040: _low_level_execute_command(): starting 12180 1727204085.88044: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204085.879391-15161-135889604599360 `" && echo ansible-tmp-1727204085.879391-15161-135889604599360="` echo /root/.ansible/tmp/ansible-tmp-1727204085.879391-15161-135889604599360 `" ) && sleep 0' 12180 1727204085.88673: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204085.88717: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204085.88733: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204085.88753: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204085.88800: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204085.88820: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204085.88835: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204085.88854: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204085.88868: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204085.88879: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204085.88891: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204085.88905: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/ssh/ssh_config <<< 12180 1727204085.88926: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204085.88940: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204085.88951: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204085.88967: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204085.89050: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204085.89077: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204085.89095: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204085.89222: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204085.91071: stdout chunk (state=3): >>>ansible-tmp-1727204085.879391-15161-135889604599360=/root/.ansible/tmp/ansible-tmp-1727204085.879391-15161-135889604599360 <<< 12180 1727204085.91371: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204085.91375: stdout chunk (state=3): >>><<< 12180 1727204085.91378: stderr chunk (state=3): >>><<< 12180 1727204085.91673: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204085.879391-15161-135889604599360=/root/.ansible/tmp/ansible-tmp-1727204085.879391-15161-135889604599360 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204085.91677: variable 'ansible_module_compression' from source: unknown 12180 1727204085.91680: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12180cbnqllfr/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 12180 1727204085.91682: variable 'ansible_facts' from source: unknown 12180 1727204085.91768: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204085.879391-15161-135889604599360/AnsiballZ_systemd.py 12180 1727204085.92497: Sending initial data 12180 1727204085.92501: Sent initial data (155 bytes) 12180 1727204085.94950: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204085.94955: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204085.94979: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 12180 1727204085.94982: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204085.95102: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204085.95106: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204085.95186: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204085.95189: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204085.95381: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204085.97079: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12180 1727204085.97125: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12180 1727204085.97236: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12180cbnqllfr/tmpjvtnrr_a /root/.ansible/tmp/ansible-tmp-1727204085.879391-15161-135889604599360/AnsiballZ_systemd.py <<< 12180 1727204085.97297: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12180 1727204086.00114: stderr chunk (state=3): >>>debug2: Received exit status from master 0 
<<< 12180 1727204086.00206: stderr chunk (state=3): >>><<< 12180 1727204086.00210: stdout chunk (state=3): >>><<< 12180 1727204086.00236: done transferring module to remote 12180 1727204086.00247: _low_level_execute_command(): starting 12180 1727204086.00251: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204085.879391-15161-135889604599360/ /root/.ansible/tmp/ansible-tmp-1727204085.879391-15161-135889604599360/AnsiballZ_systemd.py && sleep 0' 12180 1727204086.03132: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204086.03145: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204086.03155: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204086.03171: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204086.03219: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204086.03282: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204086.03292: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204086.03305: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204086.03313: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204086.03320: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204086.03329: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204086.03341: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204086.03353: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204086.03361: stderr chunk (state=3): >>>debug2: 
checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204086.03371: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204086.03381: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204086.03455: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204086.03786: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204086.03797: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204086.04010: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204086.05833: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204086.05837: stdout chunk (state=3): >>><<< 12180 1727204086.05846: stderr chunk (state=3): >>><<< 12180 1727204086.05873: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204086.05876: _low_level_execute_command(): starting 12180 1727204086.05879: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204085.879391-15161-135889604599360/AnsiballZ_systemd.py && sleep 0' 12180 1727204086.06727: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204086.06732: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204086.06748: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204086.06786: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 12180 1727204086.06790: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration <<< 12180 1727204086.06802: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204086.06807: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204086.06814: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204086.06820: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204086.06839: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204086.06904: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204086.06919: stderr chunk (state=3): >>>debug2: fd 3 setting 
O_NONBLOCK <<< 12180 1727204086.06924: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204086.07023: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204086.32176: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "619", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:49:25 EDT", "ExecMainStartTimestampMonotonic": "28837083", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "619", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 
}", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.<<< 12180 1727204086.32183: stdout chunk (state=3): >>>service", "ControlGroupId": "2418", "MemoryCurrent": "14004224", "MemoryAvailable": "infinity", "CPUUsageNSec": "537898000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", 
"LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", 
"CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSi<<< 12180 1727204086.32193: stdout chunk (state=3): >>>gnal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service NetworkManager-wait-online.service network.target network.service shutdown.target multi-user.target", "After": "dbus.socket systemd-journald.socket sysinit.target network-pre.target basic.target system.slice cloud-init-local.service dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:54:21 EDT", "StateChangeTimestampMonotonic": "324827295", "InactiveExitTimestamp": "Tue 2024-09-24 14:49:25 EDT", "InactiveExitTimestampMonotonic": "28837278", "ActiveEnterTimestamp": "Tue 2024-09-24 14:49:27 EDT", "ActiveEnterTimestampMonotonic": "30313565", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", 
"CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:49:25 EDT", "ConditionTimestampMonotonic": "28833288", "AssertTimestamp": "Tue 2024-09-24 14:49:25 EDT", "AssertTimestampMonotonic": "28833291", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "a065c0d4382c4b51bfc5a74ffa3d403d", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 12180 1727204086.33716: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
<<< 12180 1727204086.33749: stderr chunk (state=3): >>><<< 12180 1727204086.33753: stdout chunk (state=3): >>><<< 12180 1727204086.34061: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "619", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:49:25 EDT", "ExecMainStartTimestampMonotonic": "28837083", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "619", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call 
org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2418", "MemoryCurrent": "14004224", "MemoryAvailable": "infinity", "CPUUsageNSec": "537898000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", 
"LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", 
"MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service NetworkManager-wait-online.service network.target network.service shutdown.target multi-user.target", "After": "dbus.socket systemd-journald.socket sysinit.target network-pre.target basic.target system.slice cloud-init-local.service dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:54:21 EDT", "StateChangeTimestampMonotonic": "324827295", "InactiveExitTimestamp": "Tue 2024-09-24 14:49:25 EDT", "InactiveExitTimestampMonotonic": "28837278", "ActiveEnterTimestamp": "Tue 2024-09-24 14:49:27 EDT", "ActiveEnterTimestampMonotonic": "30313565", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", 
"OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:49:25 EDT", "ConditionTimestampMonotonic": "28833288", "AssertTimestamp": "Tue 2024-09-24 14:49:25 EDT", "AssertTimestampMonotonic": "28833291", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "a065c0d4382c4b51bfc5a74ffa3d403d", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 12180 1727204086.34079: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204085.879391-15161-135889604599360/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12180 1727204086.34084: _low_level_execute_command(): starting 12180 1727204086.34086: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204085.879391-15161-135889604599360/ > /dev/null 2>&1 && sleep 0' 12180 1727204086.34673: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204086.34688: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204086.34702: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204086.34719: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204086.34765: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204086.34776: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204086.34786: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204086.34799: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204086.34806: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204086.34813: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204086.34821: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204086.34832: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204086.34841: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204086.34849: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204086.34855: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204086.34869: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204086.34947: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204086.34962: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204086.34969: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204086.35656: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204086.37498: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204086.37613: stderr chunk (state=3): >>><<< 12180 1727204086.37616: stdout chunk (state=3): >>><<< 12180 1727204086.37619: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204086.37621: handler run complete 12180 1727204086.38075: attempt loop complete, returning result 12180 1727204086.38079: _execute() done 12180 1727204086.38081: dumping result to json 12180 1727204086.38083: done dumping result, returning 12180 1727204086.38086: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcd87-79f5-ccb1-55ae-000000000089] 12180 1727204086.38088: sending task result for task 0affcd87-79f5-ccb1-55ae-000000000089 12180 1727204086.38245: done sending task result for task 0affcd87-79f5-ccb1-55ae-000000000089 12180 1727204086.38249: WORKER PROCESS EXITING ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12180 1727204086.38324: no more pending results, returning what we have 12180 1727204086.38326: results queue empty 12180 1727204086.38327: checking for any_errors_fatal 12180 1727204086.38332: done checking for any_errors_fatal 12180 1727204086.38333: checking for max_fail_percentage 12180 1727204086.38334: done checking for max_fail_percentage 12180 1727204086.38335: checking to see 
if all hosts have failed and the running result is not ok
12180 1727204086.38336: done checking to see if all hosts have failed
12180 1727204086.38337: getting the remaining hosts for this loop
12180 1727204086.38338: done getting the remaining hosts for this loop
12180 1727204086.38341: getting the next task for host managed-node1
12180 1727204086.38347: done getting next task for host managed-node1
12180 1727204086.38350: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant
12180 1727204086.38354: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task?
False
12180 1727204086.38367: getting variables
12180 1727204086.38368: in VariableManager get_vars()
12180 1727204086.38403: Calling all_inventory to load vars for managed-node1
12180 1727204086.38406: Calling groups_inventory to load vars for managed-node1
12180 1727204086.38409: Calling all_plugins_inventory to load vars for managed-node1
12180 1727204086.38420: Calling all_plugins_play to load vars for managed-node1
12180 1727204086.38423: Calling groups_plugins_inventory to load vars for managed-node1
12180 1727204086.38426: Calling groups_plugins_play to load vars for managed-node1
12180 1727204086.41519: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12180 1727204086.44654: done with get_vars()
12180 1727204086.44690: done getting variables
12180 1727204086.44760: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] *****
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133
Tuesday 24 September 2024 14:54:46 -0400 (0:00:00.746) 0:00:33.859 *****
12180 1727204086.44799: entering _queue_task() for managed-node1/service
12180 1727204086.45830: worker is 1 (out of 1 available)
12180 1727204086.45842: exiting _queue_task() for managed-node1/service
12180 1727204086.45854: done queuing things up, now waiting for results queue to drain
12180 1727204086.45856: waiting for pending results...
12180 1727204086.46872: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant
12180 1727204086.47049: in run() - task 0affcd87-79f5-ccb1-55ae-00000000008a
12180 1727204086.47322: variable 'ansible_search_path' from source: unknown
12180 1727204086.47334: variable 'ansible_search_path' from source: unknown
12180 1727204086.47381: calling self._execute()
12180 1727204086.47679: variable 'ansible_host' from source: host vars for 'managed-node1'
12180 1727204086.47692: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12180 1727204086.47706: variable 'omit' from source: magic vars
12180 1727204086.48548: variable 'ansible_distribution_major_version' from source: facts
12180 1727204086.48571: Evaluated conditional (ansible_distribution_major_version != '6'): True
12180 1727204086.48700: variable 'network_provider' from source: set_fact
12180 1727204086.48714: Evaluated conditional (network_provider == "nm"): True
12180 1727204086.48818: variable '__network_wpa_supplicant_required' from source: role '' defaults
12180 1727204086.48913: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults
12180 1727204086.49093: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
12180 1727204086.51482: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
12180 1727204086.51568: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
12180 1727204086.51614: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
12180 1727204086.51661: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
12180 1727204086.51693: Loading FilterModule 'urlsplit' from
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
12180 1727204086.51798: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12180 1727204086.51835: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12180 1727204086.51870: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12180 1727204086.51919: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12180 1727204086.51942: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12180 1727204086.52001: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12180 1727204086.52032: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12180 1727204086.52065: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12180 1727204086.52114: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12180 1727204086.52137: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12180 1727204086.52186: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12180 1727204086.52218: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12180 1727204086.52248: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12180 1727204086.52291: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12180 1727204086.52315: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12180 1727204086.52481: variable 'network_connections' from source: task vars
12180 1727204086.52498: variable 'port2_profile' from source: play vars
12180 1727204086.52583: variable 'port2_profile' from source: play vars
12180 1727204086.52600: variable 'port1_profile' from source: play vars
12180 1727204086.52676: variable 'port1_profile' from source: play vars
12180 1727204086.52691: variable 'controller_profile' from source: play vars
12180 1727204086.52762: variable 'controller_profile' from source: play vars
12180
1727204086.52842: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
12180 1727204086.53027: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
12180 1727204086.53076: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
12180 1727204086.53111: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
12180 1727204086.53835: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
12180 1727204086.53889: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
12180 1727204086.54095: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
12180 1727204086.54124: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
12180 1727204086.54156: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
12180 1727204086.54211: variable '__network_wireless_connections_defined' from source: role '' defaults
12180 1727204086.54488: variable 'network_connections' from source: task vars
12180 1727204086.54983: variable 'port2_profile' from source: play vars
12180 1727204086.55059: variable 'port2_profile' from source: play vars
12180 1727204086.55076: variable 'port1_profile' from source: play vars
12180 1727204086.55144: variable 'port1_profile' from source: play vars
12180 1727204086.55171: variable
'controller_profile' from source: play vars
12180 1727204086.55238: variable 'controller_profile' from source: play vars
12180 1727204086.55279: Evaluated conditional (__network_wpa_supplicant_required): False
12180 1727204086.55376: when evaluation is False, skipping this task
12180 1727204086.55384: _execute() done
12180 1727204086.55391: dumping result to json
12180 1727204086.55398: done dumping result, returning
12180 1727204086.55409: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcd87-79f5-ccb1-55ae-00000000008a]
12180 1727204086.55419: sending task result for task 0affcd87-79f5-ccb1-55ae-00000000008a
skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" }
12180 1727204086.55603: no more pending results, returning what we have
12180 1727204086.55607: results queue empty
12180 1727204086.55609: checking for any_errors_fatal
12180 1727204086.55629: done checking for any_errors_fatal
12180 1727204086.55629: checking for max_fail_percentage
12180 1727204086.55631: done checking for max_fail_percentage
12180 1727204086.55632: checking to see if all hosts have failed and the running result is not ok
12180 1727204086.55633: done checking to see if all hosts have failed
12180 1727204086.55634: getting the remaining hosts for this loop
12180 1727204086.55635: done getting the remaining hosts for this loop
12180 1727204086.55639: getting the next task for host managed-node1
12180 1727204086.55647: done getting next task for host managed-node1
12180 1727204086.55651: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service
12180 1727204086.55656: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state?
(None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
12180 1727204086.55677: getting variables
12180 1727204086.55679: in VariableManager get_vars()
12180 1727204086.55724: Calling all_inventory to load vars for managed-node1
12180 1727204086.55727: Calling groups_inventory to load vars for managed-node1
12180 1727204086.55730: Calling all_plugins_inventory to load vars for managed-node1
12180 1727204086.55741: Calling all_plugins_play to load vars for managed-node1
12180 1727204086.55744: Calling groups_plugins_inventory to load vars for managed-node1
12180 1727204086.55746: Calling groups_plugins_play to load vars for managed-node1
12180 1727204086.56383: done sending task result for task 0affcd87-79f5-ccb1-55ae-00000000008a
12180 1727204086.56387: WORKER PROCESS EXITING
12180 1727204086.58837: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12180 1727204086.61576: done with get_vars()
12180 1727204086.61613: done getting variables
12180 1727204086.61693: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Enable network service]
**************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142
Tuesday 24 September 2024 14:54:46 -0400 (0:00:00.169) 0:00:34.029 *****
12180 1727204086.61738: entering _queue_task() for managed-node1/service
12180 1727204086.62095: worker is 1 (out of 1 available)
12180 1727204086.62107: exiting _queue_task() for managed-node1/service
12180 1727204086.62119: done queuing things up, now waiting for results queue to drain
12180 1727204086.62121: waiting for pending results...
12180 1727204086.62437: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable network service
12180 1727204086.62614: in run() - task 0affcd87-79f5-ccb1-55ae-00000000008b
12180 1727204086.62638: variable 'ansible_search_path' from source: unknown
12180 1727204086.62648: variable 'ansible_search_path' from source: unknown
12180 1727204086.62697: calling self._execute()
12180 1727204086.62814: variable 'ansible_host' from source: host vars for 'managed-node1'
12180 1727204086.62838: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12180 1727204086.62855: variable 'omit' from source: magic vars
12180 1727204086.63278: variable 'ansible_distribution_major_version' from source: facts
12180 1727204086.63297: Evaluated conditional (ansible_distribution_major_version != '6'): True
12180 1727204086.63429: variable 'network_provider' from source: set_fact
12180 1727204086.63445: Evaluated conditional (network_provider == "initscripts"): False
12180 1727204086.63452: when evaluation is False, skipping this task
12180 1727204086.63459: _execute() done
12180 1727204086.63468: dumping result to json
12180 1727204086.63476: done dumping result, returning
12180 1727204086.63489: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable network service [0affcd87-79f5-ccb1-55ae-00000000008b]
12180 1727204086.63501: sending task result for task
0affcd87-79f5-ccb1-55ae-00000000008b
skipping: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false }
12180 1727204086.63660: no more pending results, returning what we have
12180 1727204086.63669: results queue empty
12180 1727204086.63670: checking for any_errors_fatal
12180 1727204086.63679: done checking for any_errors_fatal
12180 1727204086.63680: checking for max_fail_percentage
12180 1727204086.63682: done checking for max_fail_percentage
12180 1727204086.63683: checking to see if all hosts have failed and the running result is not ok
12180 1727204086.63684: done checking to see if all hosts have failed
12180 1727204086.63684: getting the remaining hosts for this loop
12180 1727204086.63686: done getting the remaining hosts for this loop
12180 1727204086.63690: getting the next task for host managed-node1
12180 1727204086.63698: done getting next task for host managed-node1
12180 1727204086.63702: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present
12180 1727204086.63707: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task?
False
12180 1727204086.63729: getting variables
12180 1727204086.63734: in VariableManager get_vars()
12180 1727204086.63783: Calling all_inventory to load vars for managed-node1
12180 1727204086.63787: Calling groups_inventory to load vars for managed-node1
12180 1727204086.63789: Calling all_plugins_inventory to load vars for managed-node1
12180 1727204086.63802: Calling all_plugins_play to load vars for managed-node1
12180 1727204086.63805: Calling groups_plugins_inventory to load vars for managed-node1
12180 1727204086.63809: Calling groups_plugins_play to load vars for managed-node1
12180 1727204086.64828: done sending task result for task 0affcd87-79f5-ccb1-55ae-00000000008b
12180 1727204086.64834: WORKER PROCESS EXITING
12180 1727204086.65796: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12180 1727204086.68455: done with get_vars()
12180 1727204086.68488: done getting variables
12180 1727204086.68558: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] ***
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150
Tuesday 24 September 2024 14:54:46 -0400 (0:00:00.068) 0:00:34.097 *****
12180 1727204086.68597: entering _queue_task() for managed-node1/copy
12180 1727204086.68957: worker is 1 (out of 1 available)
12180 1727204086.68974: exiting _queue_task() for managed-node1/copy
12180 1727204086.68985: done queuing things up, now waiting for results queue to drain
12180 1727204086.68986: waiting for pending results...
12180 1727204086.69320: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present
12180 1727204086.69628: in run() - task 0affcd87-79f5-ccb1-55ae-00000000008c
12180 1727204086.69653: variable 'ansible_search_path' from source: unknown
12180 1727204086.69663: variable 'ansible_search_path' from source: unknown
12180 1727204086.69709: calling self._execute()
12180 1727204086.69856: variable 'ansible_host' from source: host vars for 'managed-node1'
12180 1727204086.69870: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12180 1727204086.69885: variable 'omit' from source: magic vars
12180 1727204086.70844: variable 'ansible_distribution_major_version' from source: facts
12180 1727204086.70888: Evaluated conditional (ansible_distribution_major_version != '6'): True
12180 1727204086.71139: variable 'network_provider' from source: set_fact
12180 1727204086.71211: Evaluated conditional (network_provider == "initscripts"): False
12180 1727204086.71219: when evaluation is False, skipping this task
12180 1727204086.71226: _execute() done
12180 1727204086.71236: dumping result to json
12180 1727204086.71243: done dumping result, returning
12180 1727204086.71319: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcd87-79f5-ccb1-55ae-00000000008c]
12180 1727204086.71335: sending task result for task 0affcd87-79f5-ccb1-55ae-00000000008c
skipping: [managed-node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" }
12180 1727204086.71496: no more pending results, returning what we have
12180 1727204086.71500: results queue empty
12180 1727204086.71501: checking for any_errors_fatal
12180 1727204086.71507: done checking for any_errors_fatal
12180 1727204086.71507: checking for max_fail_percentage
12180
1727204086.71510: done checking for max_fail_percentage
12180 1727204086.71511: checking to see if all hosts have failed and the running result is not ok
12180 1727204086.71511: done checking to see if all hosts have failed
12180 1727204086.71512: getting the remaining hosts for this loop
12180 1727204086.71513: done getting the remaining hosts for this loop
12180 1727204086.71517: getting the next task for host managed-node1
12180 1727204086.71526: done getting next task for host managed-node1
12180 1727204086.71533: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles
12180 1727204086.71538: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task?
False
12180 1727204086.71560: getting variables
12180 1727204086.71562: in VariableManager get_vars()
12180 1727204086.71608: Calling all_inventory to load vars for managed-node1
12180 1727204086.71612: Calling groups_inventory to load vars for managed-node1
12180 1727204086.71615: Calling all_plugins_inventory to load vars for managed-node1
12180 1727204086.71628: Calling all_plugins_play to load vars for managed-node1
12180 1727204086.71633: Calling groups_plugins_inventory to load vars for managed-node1
12180 1727204086.71636: Calling groups_plugins_play to load vars for managed-node1
12180 1727204086.72619: done sending task result for task 0affcd87-79f5-ccb1-55ae-00000000008c
12180 1727204086.72623: WORKER PROCESS EXITING
12180 1727204086.73790: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12180 1727204086.76018: done with get_vars()
12180 1727204086.76048: done getting variables
TASK [fedora.linux_system_roles.network : Configure networking connection profiles] ***
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Tuesday 24 September 2024 14:54:46 -0400 (0:00:00.076) 0:00:34.174 *****
12180 1727204086.76267: entering _queue_task() for managed-node1/fedora.linux_system_roles.network_connections
12180 1727204086.76920: worker is 1 (out of 1 available)
12180 1727204086.76934: exiting _queue_task() for managed-node1/fedora.linux_system_roles.network_connections
12180 1727204086.76947: done queuing things up, now waiting for results queue to drain
12180 1727204086.76948: waiting for pending results...
12180 1727204086.77261: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles
12180 1727204086.77439: in run() - task 0affcd87-79f5-ccb1-55ae-00000000008d
12180 1727204086.77460: variable 'ansible_search_path' from source: unknown
12180 1727204086.77470: variable 'ansible_search_path' from source: unknown
12180 1727204086.77519: calling self._execute()
12180 1727204086.77648: variable 'ansible_host' from source: host vars for 'managed-node1'
12180 1727204086.77659: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
12180 1727204086.77676: variable 'omit' from source: magic vars
12180 1727204086.78086: variable 'ansible_distribution_major_version' from source: facts
12180 1727204086.78104: Evaluated conditional (ansible_distribution_major_version != '6'): True
12180 1727204086.78114: variable 'omit' from source: magic vars
12180 1727204086.78193: variable 'omit' from source: magic vars
12180 1727204086.78373: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
12180 1727204086.80974: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
12180 1727204086.81048: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
12180 1727204086.81099: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
12180 1727204086.81152: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
12180 1727204086.81191: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
12180 1727204086.81292: variable 'network_provider' from source: set_fact
12180 1727204086.82687: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12180 1727204086.83056: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12180 1727204086.83094: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12180 1727204086.83152: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12180 1727204086.83174: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12180 1727204086.83273: variable 'omit' from source: magic vars
12180 1727204086.83409: variable 'omit' from source: magic vars
12180 1727204086.83533: variable 'network_connections' from source: task vars
12180 1727204086.83550: variable 'port2_profile' from source: play vars
12180 1727204086.83626: variable 'port2_profile' from source: play vars
12180 1727204086.83643: variable 'port1_profile' from source: play vars
12180 1727204086.83795: variable 'port1_profile' from source: play vars
12180 1727204086.83810: variable 'controller_profile' from source: play vars
12180 1727204086.83868: variable 'controller_profile' from source: play vars
12180 1727204086.84188: variable 'omit' from source: magic vars
12180 1727204086.84252: variable '__lsr_ansible_managed' from source: task vars
12180 1727204086.84411: variable '__lsr_ansible_managed' from source: task vars
12180 1727204086.84847: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup
12180
1727204086.85308: Loaded config def from plugin (lookup/template)
12180 1727204086.85370: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py
12180 1727204086.85408: File lookup term: get_ansible_managed.j2
12180 1727204086.85415: variable 'ansible_search_path' from source: unknown
12180 1727204086.85423: evaluation_path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks
12180 1727204086.85449: search_path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2
12180 1727204086.85474: variable 'ansible_search_path' from source: unknown
12180 1727204086.92301: variable 'ansible_managed' from source: unknown
12180 1727204086.92478: variable 'omit' from source: magic vars
12180 1727204086.92513: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
12180 1727204086.92549: Loading Connection 'ssh' from
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12180 1727204086.92586: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12180 1727204086.92610: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204086.92626: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204086.92665: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12180 1727204086.92686: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204086.92695: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204086.92804: Set connection var ansible_pipelining to False 12180 1727204086.92814: Set connection var ansible_shell_type to sh 12180 1727204086.92824: Set connection var ansible_timeout to 10 12180 1727204086.92836: Set connection var ansible_connection to ssh 12180 1727204086.92845: Set connection var ansible_shell_executable to /bin/sh 12180 1727204086.92854: Set connection var ansible_module_compression to ZIP_DEFLATED 12180 1727204086.92897: variable 'ansible_shell_executable' from source: unknown 12180 1727204086.92908: variable 'ansible_connection' from source: unknown 12180 1727204086.92915: variable 'ansible_module_compression' from source: unknown 12180 1727204086.92920: variable 'ansible_shell_type' from source: unknown 12180 1727204086.92926: variable 'ansible_shell_executable' from source: unknown 12180 1727204086.92934: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204086.92941: variable 'ansible_pipelining' from source: unknown 12180 1727204086.92947: variable 'ansible_timeout' from source: unknown 12180 1727204086.92962: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 
1727204086.93182: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 12180 1727204086.93242: variable 'omit' from source: magic vars 12180 1727204086.93259: starting attempt loop 12180 1727204086.93268: running the handler 12180 1727204086.93284: _low_level_execute_command(): starting 12180 1727204086.93295: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12180 1727204086.93870: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204086.93886: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204086.93912: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204086.93926: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204086.93973: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204086.93985: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204086.94063: stderr chunk 
(state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204086.95759: stdout chunk (state=3): >>>/root <<< 12180 1727204086.95988: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204086.96074: stderr chunk (state=3): >>><<< 12180 1727204086.96078: stdout chunk (state=3): >>><<< 12180 1727204086.96101: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204086.96114: _low_level_execute_command(): starting 12180 1727204086.96121: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204086.9610152-15209-161533763965334 `" && echo ansible-tmp-1727204086.9610152-15209-161533763965334="` echo /root/.ansible/tmp/ansible-tmp-1727204086.9610152-15209-161533763965334 `" ) && 
sleep 0' 12180 1727204086.97479: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204086.97488: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204086.97507: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204086.97533: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204086.97575: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204086.97583: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204086.97594: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204086.97614: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204086.97629: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204086.97635: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204086.97644: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204086.97654: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204086.97668: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204086.97676: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204086.97683: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204086.97693: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204086.97777: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204086.97791: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 
1727204086.97801: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204086.97890: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204086.99748: stdout chunk (state=3): >>>ansible-tmp-1727204086.9610152-15209-161533763965334=/root/.ansible/tmp/ansible-tmp-1727204086.9610152-15209-161533763965334 <<< 12180 1727204086.99953: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204087.00038: stderr chunk (state=3): >>><<< 12180 1727204087.00040: stdout chunk (state=3): >>><<< 12180 1727204087.00075: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204086.9610152-15209-161533763965334=/root/.ansible/tmp/ansible-tmp-1727204086.9610152-15209-161533763965334 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204087.00258: variable 
'ansible_module_compression' from source: unknown 12180 1727204087.00270: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12180cbnqllfr/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 12180 1727204087.00273: variable 'ansible_facts' from source: unknown 12180 1727204087.00389: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204086.9610152-15209-161533763965334/AnsiballZ_network_connections.py 12180 1727204087.00575: Sending initial data 12180 1727204087.00579: Sent initial data (168 bytes) 12180 1727204087.01612: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204087.01616: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204087.01662: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204087.01668: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204087.01670: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204087.01716: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204087.01724: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204087.01732: stderr chunk 
(state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204087.01853: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204087.03557: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12180 1727204087.03660: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12180 1727204087.03710: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12180cbnqllfr/tmpze3pdkmr /root/.ansible/tmp/ansible-tmp-1727204086.9610152-15209-161533763965334/AnsiballZ_network_connections.py <<< 12180 1727204087.04228: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12180 1727204087.05494: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204087.05598: stderr chunk (state=3): >>><<< 12180 1727204087.05602: stdout chunk (state=3): >>><<< 12180 1727204087.05622: done transferring module to remote 12180 1727204087.05631: _low_level_execute_command(): starting 12180 1727204087.05638: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204086.9610152-15209-161533763965334/ /root/.ansible/tmp/ansible-tmp-1727204086.9610152-15209-161533763965334/AnsiballZ_network_connections.py && sleep 0' 12180 1727204087.06115: stderr chunk (state=2): 
>>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204087.06121: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204087.06159: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204087.06163: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204087.06175: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204087.06180: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration <<< 12180 1727204087.06187: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204087.06194: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204087.06204: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204087.06209: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204087.06280: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204087.06284: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204087.06345: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204087.08086: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204087.08170: stderr chunk (state=3): >>><<< 12180 1727204087.08174: stdout chunk (state=3): >>><<< 12180 1727204087.08200: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204087.08203: _low_level_execute_command(): starting 12180 1727204087.08208: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204086.9610152-15209-161533763965334/AnsiballZ_network_connections.py && sleep 0' 12180 1727204087.08894: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204087.08902: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204087.08912: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204087.08926: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204087.08975: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.9.148 originally 10.31.9.148 <<< 12180 1727204087.08982: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204087.08992: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204087.09005: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204087.09013: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204087.09019: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204087.09027: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204087.09036: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204087.09047: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204087.09061: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204087.09069: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204087.09078: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204087.09151: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204087.09176: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204087.09188: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204087.09284: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204087.59132: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload__9eel8k5/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in 
_nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload__9eel8k5/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.1/24de85ff-9c9d-4d15-a711-da166680223b: error=unknown <<< 12180 1727204087.61049: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload__9eel8k5/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload__9eel8k5/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.0/6724accd-c5ba-48f8-ba5f-9a64052000e9: error=unknown <<< 12180 1727204087.62923: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload__9eel8k5/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back <<< 12180 1727204087.62928: stdout chunk (state=3): >>> File 
"/tmp/ansible_fedora.linux_system_roles.network_connections_payload__9eel8k5/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail <<< 12180 1727204087.62990: stdout chunk (state=3): >>>ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0/7454a8c0-a94c-487e-947e-0611b087626b: error=unknown <<< 12180 1727204087.63222: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 12180 1727204087.64871: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
<<< 12180 1727204087.64875: stdout chunk (state=3): >>><<< 12180 1727204087.64883: stderr chunk (state=3): >>><<< 12180 1727204087.64906: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload__9eel8k5/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload__9eel8k5/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.1/24de85ff-9c9d-4d15-a711-da166680223b: error=unknown Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload__9eel8k5/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload__9eel8k5/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.0/6724accd-c5ba-48f8-ba5f-9a64052000e9: error=unknown Traceback (most recent call last): File 
"/tmp/ansible_fedora.linux_system_roles.network_connections_payload__9eel8k5/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload__9eel8k5/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0/7454a8c0-a94c-487e-947e-0611b087626b: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 12180 1727204087.64958: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0.1', 'persistent_state': 'absent', 'state': 'down'}, {'name': 'bond0.0', 'persistent_state': 'absent', 'state': 'down'}, {'name': 'bond0', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204086.9610152-15209-161533763965334/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12180 1727204087.64970: _low_level_execute_command(): starting 12180 
1727204087.64975: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204086.9610152-15209-161533763965334/ > /dev/null 2>&1 && sleep 0' 12180 1727204087.66258: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204087.66885: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204087.66895: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204087.66912: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204087.66970: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204087.66974: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204087.66976: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204087.66989: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204087.66996: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204087.67003: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204087.67011: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204087.67020: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204087.67031: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204087.67042: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204087.67049: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204087.67058: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204087.67136: stderr 
chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204087.67155: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204087.67170: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204087.67270: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204087.69202: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204087.69206: stdout chunk (state=3): >>><<< 12180 1727204087.69214: stderr chunk (state=3): >>><<< 12180 1727204087.69231: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204087.69241: handler run complete 12180 1727204087.69275: attempt loop complete, returning result 12180 1727204087.69279: _execute() done 12180 1727204087.69281: dumping result to json 12180 
1727204087.69287: done dumping result, returning 12180 1727204087.69298: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcd87-79f5-ccb1-55ae-00000000008d] 12180 1727204087.69302: sending task result for task 0affcd87-79f5-ccb1-55ae-00000000008d 12180 1727204087.69426: done sending task result for task 0affcd87-79f5-ccb1-55ae-00000000008d 12180 1727204087.69429: WORKER PROCESS EXITING changed: [managed-node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "bond0.1", "persistent_state": "absent", "state": "down" }, { "name": "bond0.0", "persistent_state": "absent", "state": "down" }, { "name": "bond0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 12180 1727204087.69538: no more pending results, returning what we have 12180 1727204087.69542: results queue empty 12180 1727204087.69543: checking for any_errors_fatal 12180 1727204087.69549: done checking for any_errors_fatal 12180 1727204087.69549: checking for max_fail_percentage 12180 1727204087.69552: done checking for max_fail_percentage 12180 1727204087.69552: checking to see if all hosts have failed and the running result is not ok 12180 1727204087.69553: done checking to see if all hosts have failed 12180 1727204087.69554: getting the remaining hosts for this loop 12180 1727204087.69555: done getting the remaining hosts for this loop 12180 1727204087.69558: getting the next task for host managed-node1 12180 1727204087.69567: done getting next task for host managed-node1 12180 1727204087.69571: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 12180 1727204087.69575: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 12180 1727204087.69590: getting variables 12180 1727204087.69591: in VariableManager get_vars() 12180 1727204087.69629: Calling all_inventory to load vars for managed-node1 12180 1727204087.69632: Calling groups_inventory to load vars for managed-node1 12180 1727204087.69634: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204087.69643: Calling all_plugins_play to load vars for managed-node1 12180 1727204087.69646: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204087.69648: Calling groups_plugins_play to load vars for managed-node1 12180 1727204087.72595: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204087.75689: done with get_vars() 12180 1727204087.75723: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 14:54:47 -0400 (0:00:01.002) 0:00:35.177 ***** 12180 1727204087.76521: entering _queue_task() for managed-node1/fedora.linux_system_roles.network_state 12180 1727204087.76836: worker is 1 (out of 1 available) 12180 1727204087.76847: 
exiting _queue_task() for managed-node1/fedora.linux_system_roles.network_state 12180 1727204087.76860: done queuing things up, now waiting for results queue to drain 12180 1727204087.76862: waiting for pending results... 12180 1727204087.77646: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking state 12180 1727204087.77780: in run() - task 0affcd87-79f5-ccb1-55ae-00000000008e 12180 1727204087.77792: variable 'ansible_search_path' from source: unknown 12180 1727204087.77796: variable 'ansible_search_path' from source: unknown 12180 1727204087.77831: calling self._execute() 12180 1727204087.77924: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204087.77928: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204087.77942: variable 'omit' from source: magic vars 12180 1727204087.78284: variable 'ansible_distribution_major_version' from source: facts 12180 1727204087.78296: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204087.79378: variable 'network_state' from source: role '' defaults 12180 1727204087.79390: Evaluated conditional (network_state != {}): False 12180 1727204087.79393: when evaluation is False, skipping this task 12180 1727204087.79396: _execute() done 12180 1727204087.79398: dumping result to json 12180 1727204087.79401: done dumping result, returning 12180 1727204087.79408: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking state [0affcd87-79f5-ccb1-55ae-00000000008e] 12180 1727204087.79415: sending task result for task 0affcd87-79f5-ccb1-55ae-00000000008e 12180 1727204087.79510: done sending task result for task 0affcd87-79f5-ccb1-55ae-00000000008e 12180 1727204087.79513: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 12180 
1727204087.79586: no more pending results, returning what we have 12180 1727204087.79591: results queue empty 12180 1727204087.79592: checking for any_errors_fatal 12180 1727204087.79605: done checking for any_errors_fatal 12180 1727204087.79606: checking for max_fail_percentage 12180 1727204087.79608: done checking for max_fail_percentage 12180 1727204087.79609: checking to see if all hosts have failed and the running result is not ok 12180 1727204087.79610: done checking to see if all hosts have failed 12180 1727204087.79611: getting the remaining hosts for this loop 12180 1727204087.79612: done getting the remaining hosts for this loop 12180 1727204087.79616: getting the next task for host managed-node1 12180 1727204087.79625: done getting next task for host managed-node1 12180 1727204087.79629: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 12180 1727204087.79633: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 12180 1727204087.79649: getting variables 12180 1727204087.79650: in VariableManager get_vars() 12180 1727204087.79687: Calling all_inventory to load vars for managed-node1 12180 1727204087.79690: Calling groups_inventory to load vars for managed-node1 12180 1727204087.79692: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204087.79700: Calling all_plugins_play to load vars for managed-node1 12180 1727204087.79702: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204087.79705: Calling groups_plugins_play to load vars for managed-node1 12180 1727204087.82947: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204087.86141: done with get_vars() 12180 1727204087.86575: done getting variables 12180 1727204087.86635: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 14:54:47 -0400 (0:00:00.101) 0:00:35.278 ***** 12180 1727204087.86675: entering _queue_task() for managed-node1/debug 12180 1727204087.87302: worker is 1 (out of 1 available) 12180 1727204087.87314: exiting _queue_task() for managed-node1/debug 12180 1727204087.87327: done queuing things up, now waiting for results queue to drain 12180 1727204087.87328: waiting for pending results... 
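The `module_args` recorded for the "Configure networking connection profiles" task above (three profiles, each `persistent_state: absent` / `state: down`, with `provider: nm`) correspond to a role invocation along the following lines. This is a reconstruction from the logged arguments, not the test's actual playbook; only the `network_connections` entries and the provider are taken from the log:

```yaml
# Sketch reconstructed from the logged module_args -- not the verbatim
# test playbook. The role tears down bond0 and its two port profiles.
- hosts: managed-node1
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        network_provider: nm
        network_connections:
          - name: bond0.1
            persistent_state: absent
            state: down
          - name: bond0.0
            persistent_state: absent
            state: down
          - name: bond0
            persistent_state: absent
            state: down
```

With `persistent_state: absent` the role removes the NetworkManager profiles entirely, which is why the task reports `"changed": true` even though no new configuration is applied.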
12180 1727204087.87654: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 12180 1727204087.87839: in run() - task 0affcd87-79f5-ccb1-55ae-00000000008f 12180 1727204087.87862: variable 'ansible_search_path' from source: unknown 12180 1727204087.87873: variable 'ansible_search_path' from source: unknown 12180 1727204087.87926: calling self._execute() 12180 1727204087.88041: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204087.88053: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204087.88070: variable 'omit' from source: magic vars 12180 1727204087.88475: variable 'ansible_distribution_major_version' from source: facts 12180 1727204087.88494: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204087.88505: variable 'omit' from source: magic vars 12180 1727204087.88584: variable 'omit' from source: magic vars 12180 1727204087.88626: variable 'omit' from source: magic vars 12180 1727204087.88679: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12180 1727204087.88723: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12180 1727204087.88751: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12180 1727204087.88779: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204087.88800: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204087.88836: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12180 1727204087.88846: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204087.88854: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed-node1' 12180 1727204087.88969: Set connection var ansible_pipelining to False 12180 1727204087.88978: Set connection var ansible_shell_type to sh 12180 1727204087.88990: Set connection var ansible_timeout to 10 12180 1727204087.89008: Set connection var ansible_connection to ssh 12180 1727204087.89019: Set connection var ansible_shell_executable to /bin/sh 12180 1727204087.89029: Set connection var ansible_module_compression to ZIP_DEFLATED 12180 1727204087.89067: variable 'ansible_shell_executable' from source: unknown 12180 1727204087.89077: variable 'ansible_connection' from source: unknown 12180 1727204087.89084: variable 'ansible_module_compression' from source: unknown 12180 1727204087.89090: variable 'ansible_shell_type' from source: unknown 12180 1727204087.89097: variable 'ansible_shell_executable' from source: unknown 12180 1727204087.89103: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204087.89118: variable 'ansible_pipelining' from source: unknown 12180 1727204087.89124: variable 'ansible_timeout' from source: unknown 12180 1727204087.89132: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204087.89289: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12180 1727204087.89307: variable 'omit' from source: magic vars 12180 1727204087.89317: starting attempt loop 12180 1727204087.89326: running the handler 12180 1727204087.89473: variable '__network_connections_result' from source: set_fact 12180 1727204087.89531: handler run complete 12180 1727204087.89561: attempt loop complete, returning result 12180 1727204087.89571: _execute() done 12180 1727204087.89578: dumping result to json 12180 1727204087.89586: 
done dumping result, returning 12180 1727204087.89598: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcd87-79f5-ccb1-55ae-00000000008f] 12180 1727204087.89609: sending task result for task 0affcd87-79f5-ccb1-55ae-00000000008f ok: [managed-node1] => { "__network_connections_result.stderr_lines": [ "" ] } 12180 1727204087.89788: no more pending results, returning what we have 12180 1727204087.89793: results queue empty 12180 1727204087.89794: checking for any_errors_fatal 12180 1727204087.89799: done checking for any_errors_fatal 12180 1727204087.89800: checking for max_fail_percentage 12180 1727204087.89802: done checking for max_fail_percentage 12180 1727204087.89803: checking to see if all hosts have failed and the running result is not ok 12180 1727204087.89804: done checking to see if all hosts have failed 12180 1727204087.89805: getting the remaining hosts for this loop 12180 1727204087.89806: done getting the remaining hosts for this loop 12180 1727204087.89810: getting the next task for host managed-node1 12180 1727204087.89819: done getting next task for host managed-node1 12180 1727204087.89825: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 12180 1727204087.89831: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 12180 1727204087.89844: getting variables 12180 1727204087.89846: in VariableManager get_vars() 12180 1727204087.89895: Calling all_inventory to load vars for managed-node1 12180 1727204087.89898: Calling groups_inventory to load vars for managed-node1 12180 1727204087.89901: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204087.89912: Calling all_plugins_play to load vars for managed-node1 12180 1727204087.89915: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204087.89918: Calling groups_plugins_play to load vars for managed-node1 12180 1727204087.91773: done sending task result for task 0affcd87-79f5-ccb1-55ae-00000000008f 12180 1727204087.91777: WORKER PROCESS EXITING 12180 1727204087.93442: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204087.96687: done with get_vars() 12180 1727204087.96724: done getting variables 12180 1727204087.96792: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 14:54:47 -0400 (0:00:00.101) 0:00:35.380 ***** 12180 1727204087.96831: entering _queue_task() for managed-node1/debug 12180 1727204087.97870: worker is 1 (out of 1 available) 12180 1727204087.97886: exiting _queue_task() for managed-node1/debug 12180 
1727204087.97899: done queuing things up, now waiting for results queue to drain 12180 1727204087.97901: waiting for pending results... 12180 1727204087.98357: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 12180 1727204087.98509: in run() - task 0affcd87-79f5-ccb1-55ae-000000000090 12180 1727204087.98532: variable 'ansible_search_path' from source: unknown 12180 1727204087.98541: variable 'ansible_search_path' from source: unknown 12180 1727204087.98589: calling self._execute() 12180 1727204087.98704: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204087.98721: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204087.98834: variable 'omit' from source: magic vars 12180 1727204087.99333: variable 'ansible_distribution_major_version' from source: facts 12180 1727204087.99350: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204087.99369: variable 'omit' from source: magic vars 12180 1727204087.99436: variable 'omit' from source: magic vars 12180 1727204087.99617: variable 'omit' from source: magic vars 12180 1727204087.99669: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12180 1727204087.99840: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12180 1727204087.99871: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12180 1727204087.99894: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204087.99918: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204088.00003: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12180 
1727204088.00022: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204088.00031: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204088.00226: Set connection var ansible_pipelining to False 12180 1727204088.00352: Set connection var ansible_shell_type to sh 12180 1727204088.00363: Set connection var ansible_timeout to 10 12180 1727204088.00375: Set connection var ansible_connection to ssh 12180 1727204088.00385: Set connection var ansible_shell_executable to /bin/sh 12180 1727204088.00394: Set connection var ansible_module_compression to ZIP_DEFLATED 12180 1727204088.00426: variable 'ansible_shell_executable' from source: unknown 12180 1727204088.00433: variable 'ansible_connection' from source: unknown 12180 1727204088.00440: variable 'ansible_module_compression' from source: unknown 12180 1727204088.00448: variable 'ansible_shell_type' from source: unknown 12180 1727204088.00460: variable 'ansible_shell_executable' from source: unknown 12180 1727204088.00468: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204088.00477: variable 'ansible_pipelining' from source: unknown 12180 1727204088.00484: variable 'ansible_timeout' from source: unknown 12180 1727204088.00575: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204088.00831: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12180 1727204088.00849: variable 'omit' from source: magic vars 12180 1727204088.00858: starting attempt loop 12180 1727204088.00946: running the handler 12180 1727204088.01003: variable '__network_connections_result' from source: set_fact 12180 1727204088.01110: variable '__network_connections_result' from 
source: set_fact 12180 1727204088.01252: handler run complete 12180 1727204088.01298: attempt loop complete, returning result 12180 1727204088.01306: _execute() done 12180 1727204088.01313: dumping result to json 12180 1727204088.01322: done dumping result, returning 12180 1727204088.01336: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcd87-79f5-ccb1-55ae-000000000090] 12180 1727204088.01383: sending task result for task 0affcd87-79f5-ccb1-55ae-000000000090 ok: [managed-node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "bond0.1", "persistent_state": "absent", "state": "down" }, { "name": "bond0.0", "persistent_state": "absent", "state": "down" }, { "name": "bond0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 12180 1727204088.01600: no more pending results, returning what we have 12180 1727204088.01604: results queue empty 12180 1727204088.01605: checking for any_errors_fatal 12180 1727204088.01615: done checking for any_errors_fatal 12180 1727204088.01616: checking for max_fail_percentage 12180 1727204088.01618: done checking for max_fail_percentage 12180 1727204088.01619: checking to see if all hosts have failed and the running result is not ok 12180 1727204088.01620: done checking to see if all hosts have failed 12180 1727204088.01620: getting the remaining hosts for this loop 12180 1727204088.01622: done getting the remaining hosts for this loop 12180 1727204088.01626: getting the next task for host managed-node1 12180 1727204088.01634: done getting next task for host managed-node1 12180 1727204088.01638: ^ task is: TASK: fedora.linux_system_roles.network : Show debug 
messages for the network_state 12180 1727204088.01643: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 12180 1727204088.01656: getting variables 12180 1727204088.01658: in VariableManager get_vars() 12180 1727204088.01708: Calling all_inventory to load vars for managed-node1 12180 1727204088.01711: Calling groups_inventory to load vars for managed-node1 12180 1727204088.01714: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204088.01725: Calling all_plugins_play to load vars for managed-node1 12180 1727204088.01728: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204088.01731: Calling groups_plugins_play to load vars for managed-node1 12180 1727204088.03617: done sending task result for task 0affcd87-79f5-ccb1-55ae-000000000090 12180 1727204088.03622: WORKER PROCESS EXITING 12180 1727204088.06389: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204088.09305: done with get_vars() 12180 1727204088.09341: done getting variables 12180 1727204088.09403: Loading ActionModule 'debug' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 14:54:48 -0400 (0:00:00.126) 0:00:35.506 ***** 12180 1727204088.09440: entering _queue_task() for managed-node1/debug 12180 1727204088.09779: worker is 1 (out of 1 available) 12180 1727204088.09792: exiting _queue_task() for managed-node1/debug 12180 1727204088.09803: done queuing things up, now waiting for results queue to drain 12180 1727204088.09805: waiting for pending results... 12180 1727204088.10122: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 12180 1727204088.10278: in run() - task 0affcd87-79f5-ccb1-55ae-000000000091 12180 1727204088.10300: variable 'ansible_search_path' from source: unknown 12180 1727204088.10308: variable 'ansible_search_path' from source: unknown 12180 1727204088.10351: calling self._execute() 12180 1727204088.10461: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204088.10479: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204088.10495: variable 'omit' from source: magic vars 12180 1727204088.11283: variable 'ansible_distribution_major_version' from source: facts 12180 1727204088.11302: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204088.11514: variable 'network_state' from source: role '' defaults 12180 1727204088.11570: Evaluated conditional (network_state != {}): False 12180 1727204088.11579: when evaluation is False, skipping this task 12180 1727204088.11670: 
_execute() done 12180 1727204088.11679: dumping result to json 12180 1727204088.11687: done dumping result, returning 12180 1727204088.11699: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcd87-79f5-ccb1-55ae-000000000091] 12180 1727204088.11710: sending task result for task 0affcd87-79f5-ccb1-55ae-000000000091 skipping: [managed-node1] => { "false_condition": "network_state != {}" } 12180 1727204088.11867: no more pending results, returning what we have 12180 1727204088.11871: results queue empty 12180 1727204088.11873: checking for any_errors_fatal 12180 1727204088.11886: done checking for any_errors_fatal 12180 1727204088.11887: checking for max_fail_percentage 12180 1727204088.11889: done checking for max_fail_percentage 12180 1727204088.11890: checking to see if all hosts have failed and the running result is not ok 12180 1727204088.11891: done checking to see if all hosts have failed 12180 1727204088.11892: getting the remaining hosts for this loop 12180 1727204088.11894: done getting the remaining hosts for this loop 12180 1727204088.11898: getting the next task for host managed-node1 12180 1727204088.11908: done getting next task for host managed-node1 12180 1727204088.11913: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 12180 1727204088.11918: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 12180 1727204088.11941: getting variables 12180 1727204088.11943: in VariableManager get_vars() 12180 1727204088.11992: Calling all_inventory to load vars for managed-node1 12180 1727204088.11995: Calling groups_inventory to load vars for managed-node1 12180 1727204088.11997: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204088.12011: Calling all_plugins_play to load vars for managed-node1 12180 1727204088.12013: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204088.12017: Calling groups_plugins_play to load vars for managed-node1 12180 1727204088.13383: done sending task result for task 0affcd87-79f5-ccb1-55ae-000000000091 12180 1727204088.13386: WORKER PROCESS EXITING 12180 1727204088.13876: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204088.15804: done with get_vars() 12180 1727204088.15827: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 14:54:48 -0400 (0:00:00.064) 0:00:35.571 ***** 12180 1727204088.15930: entering _queue_task() for managed-node1/ping 12180 1727204088.16237: worker is 1 (out of 1 available) 12180 1727204088.16249: exiting _queue_task() for managed-node1/ping 12180 1727204088.16261: done queuing things up, now waiting for results queue to drain 12180 1727204088.16263: waiting for pending results... 
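A pattern recurs throughout this section: `network_state` is still at its role default of `{}` (the log notes `variable 'network_state' from source: role '' defaults`), so every task gated on it is skipped with `"false_condition": "network_state != {}"`. Inside the role that gating is a `when:` clause of roughly this shape; the sketch below is illustrative, and the module argument shown is an assumption rather than the role's verbatim task:

```yaml
# Sketch of the conditional gating seen in the log. Only the when: clause
# mirrors the logged condition; the module argument here is illustrative.
- name: Configure networking state
  fedora.linux_system_roles.network_state:
    state: "{{ network_state }}"
  when: network_state != {}
```

Because the condition evaluates to `False`, `_execute()` returns immediately with `skip_reason: Conditional result was False` and no connection to the managed node is made for that task.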
12180 1727204088.17168: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 12180 1727204088.17580: in run() - task 0affcd87-79f5-ccb1-55ae-000000000092 12180 1727204088.17596: variable 'ansible_search_path' from source: unknown 12180 1727204088.17600: variable 'ansible_search_path' from source: unknown 12180 1727204088.17792: calling self._execute() 12180 1727204088.17922: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204088.17928: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204088.17941: variable 'omit' from source: magic vars 12180 1727204088.18433: variable 'ansible_distribution_major_version' from source: facts 12180 1727204088.18452: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204088.18462: variable 'omit' from source: magic vars 12180 1727204088.18530: variable 'omit' from source: magic vars 12180 1727204088.18579: variable 'omit' from source: magic vars 12180 1727204088.18623: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12180 1727204088.18669: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12180 1727204088.18694: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12180 1727204088.18711: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204088.18722: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204088.18783: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12180 1727204088.18787: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204088.18789: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node1' 12180 1727204088.18912: Set connection var ansible_pipelining to False 12180 1727204088.18915: Set connection var ansible_shell_type to sh 12180 1727204088.18921: Set connection var ansible_timeout to 10 12180 1727204088.18927: Set connection var ansible_connection to ssh 12180 1727204088.18935: Set connection var ansible_shell_executable to /bin/sh 12180 1727204088.18941: Set connection var ansible_module_compression to ZIP_DEFLATED 12180 1727204088.18982: variable 'ansible_shell_executable' from source: unknown 12180 1727204088.18986: variable 'ansible_connection' from source: unknown 12180 1727204088.18993: variable 'ansible_module_compression' from source: unknown 12180 1727204088.18996: variable 'ansible_shell_type' from source: unknown 12180 1727204088.19003: variable 'ansible_shell_executable' from source: unknown 12180 1727204088.19006: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204088.19010: variable 'ansible_pipelining' from source: unknown 12180 1727204088.19012: variable 'ansible_timeout' from source: unknown 12180 1727204088.19017: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204088.19320: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 12180 1727204088.19338: variable 'omit' from source: magic vars 12180 1727204088.19343: starting attempt loop 12180 1727204088.19346: running the handler 12180 1727204088.19361: _low_level_execute_command(): starting 12180 1727204088.19371: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12180 1727204088.21274: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204088.21290: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 
1727204088.21305: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204088.21327: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204088.21373: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204088.21385: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204088.21398: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204088.21418: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204088.21431: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204088.21442: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204088.21453: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204088.21469: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204088.21484: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204088.21496: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204088.21507: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204088.21520: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204088.21604: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204088.21628: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204088.21650: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204088.21769: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 
1727204088.23496: stdout chunk (state=3): >>>/root <<< 12180 1727204088.23699: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204088.23818: stdout chunk (state=3): >>><<< 12180 1727204088.23821: stderr chunk (state=3): >>><<< 12180 1727204088.23826: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204088.23828: _low_level_execute_command(): starting 12180 1727204088.23830: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204088.2372208-15282-97115466117485 `" && echo ansible-tmp-1727204088.2372208-15282-97115466117485="` echo /root/.ansible/tmp/ansible-tmp-1727204088.2372208-15282-97115466117485 `" ) && sleep 0' 12180 1727204088.24390: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 <<< 12180 1727204088.24411: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204088.24430: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204088.24449: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204088.24492: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204088.24505: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204088.24517: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204088.24534: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204088.24546: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204088.24557: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204088.24573: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204088.24587: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204088.24603: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204088.24615: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204088.24626: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204088.24640: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204088.24720: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204088.24744: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204088.24762: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 
<<< 12180 1727204088.24862: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204088.26770: stdout chunk (state=3): >>>ansible-tmp-1727204088.2372208-15282-97115466117485=/root/.ansible/tmp/ansible-tmp-1727204088.2372208-15282-97115466117485 <<< 12180 1727204088.26971: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204088.26975: stdout chunk (state=3): >>><<< 12180 1727204088.26977: stderr chunk (state=3): >>><<< 12180 1727204088.27289: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204088.2372208-15282-97115466117485=/root/.ansible/tmp/ansible-tmp-1727204088.2372208-15282-97115466117485 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204088.27294: variable 'ansible_module_compression' from source: unknown 12180 1727204088.27296: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-12180cbnqllfr/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 12180 1727204088.27298: variable 'ansible_facts' from source: unknown 12180 1727204088.27300: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204088.2372208-15282-97115466117485/AnsiballZ_ping.py 12180 1727204088.27419: Sending initial data 12180 1727204088.27425: Sent initial data (152 bytes) 12180 1727204088.28379: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204088.28393: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204088.28410: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204088.28429: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204088.28480: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204088.28492: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204088.28505: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204088.28522: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204088.28532: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204088.28543: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204088.28555: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204088.28569: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204088.28583: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204088.28594: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 
1727204088.28604: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204088.28616: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204088.28695: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204088.28716: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204088.28730: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204088.28816: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204088.30556: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12180 1727204088.30612: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12180 1727204088.30672: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12180cbnqllfr/tmpjlm52ypb /root/.ansible/tmp/ansible-tmp-1727204088.2372208-15282-97115466117485/AnsiballZ_ping.py <<< 12180 1727204088.30718: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12180 1727204088.31960: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204088.32090: stderr chunk (state=3): >>><<< 12180 1727204088.32093: stdout chunk (state=3): >>><<< 12180 1727204088.32096: done transferring module 
to remote 12180 1727204088.32098: _low_level_execute_command(): starting 12180 1727204088.32105: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204088.2372208-15282-97115466117485/ /root/.ansible/tmp/ansible-tmp-1727204088.2372208-15282-97115466117485/AnsiballZ_ping.py && sleep 0' 12180 1727204088.32678: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204088.32692: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204088.32707: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204088.32724: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204088.32775: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204088.32787: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204088.32801: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204088.32819: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204088.32829: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204088.32841: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204088.32852: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204088.32865: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204088.32880: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204088.32890: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204088.32901: stderr chunk (state=3): >>>debug2: match found <<< 12180 
1727204088.32915: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204088.32998: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204088.33014: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204088.33028: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204088.33172: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204088.34891: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204088.34974: stderr chunk (state=3): >>><<< 12180 1727204088.34978: stdout chunk (state=3): >>><<< 12180 1727204088.35092: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204088.35096: 
_low_level_execute_command(): starting 12180 1727204088.35098: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204088.2372208-15282-97115466117485/AnsiballZ_ping.py && sleep 0' 12180 1727204088.35713: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204088.35732: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204088.35754: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204088.35776: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204088.35818: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204088.35833: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204088.35853: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204088.35874: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204088.35885: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204088.35895: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204088.35907: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204088.35922: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204088.35941: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204088.35956: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204088.35976: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204088.35991: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204088.36069: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204088.36098: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204088.36117: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204088.36220: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204088.49505: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 12180 1727204088.50595: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. <<< 12180 1727204088.50599: stdout chunk (state=3): >>><<< 12180 1727204088.50601: stderr chunk (state=3): >>><<< 12180 1727204088.50733: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 12180 1727204088.50742: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204088.2372208-15282-97115466117485/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12180 1727204088.50750: _low_level_execute_command(): starting 12180 1727204088.50752: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204088.2372208-15282-97115466117485/ > /dev/null 2>&1 && sleep 0' 12180 1727204088.52008: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204088.52027: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204088.52045: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204088.52062: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204088.52105: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204088.52117: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204088.52133: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204088.52150: stderr chunk (state=3): >>>debug1: configuration requests 
final Match pass <<< 12180 1727204088.52161: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204088.52177: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204088.52188: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204088.52200: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204088.52214: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204088.52226: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204088.52240: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204088.52252: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204088.52333: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204088.52357: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204088.52375: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204088.52459: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204088.54707: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204088.54712: stdout chunk (state=3): >>><<< 12180 1727204088.54717: stderr chunk (state=3): >>><<< 12180 1727204088.54720: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204088.54723: handler run complete 12180 1727204088.54725: attempt loop complete, returning result 12180 1727204088.54727: _execute() done 12180 1727204088.54731: dumping result to json 12180 1727204088.54734: done dumping result, returning 12180 1727204088.54736: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcd87-79f5-ccb1-55ae-000000000092] 12180 1727204088.54739: sending task result for task 0affcd87-79f5-ccb1-55ae-000000000092 12180 1727204088.54812: done sending task result for task 0affcd87-79f5-ccb1-55ae-000000000092 12180 1727204088.54816: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "ping": "pong" } 12180 1727204088.54877: no more pending results, returning what we have 12180 1727204088.54881: results queue empty 12180 1727204088.54882: checking for any_errors_fatal 12180 1727204088.54887: done checking for any_errors_fatal 12180 1727204088.54888: checking for max_fail_percentage 12180 1727204088.54889: done checking for max_fail_percentage 12180 1727204088.54890: checking to see if all hosts have failed and the running result is not ok 12180 1727204088.54891: done checking to see if all hosts have 
failed 12180 1727204088.54892: getting the remaining hosts for this loop 12180 1727204088.54893: done getting the remaining hosts for this loop 12180 1727204088.54897: getting the next task for host managed-node1 12180 1727204088.54906: done getting next task for host managed-node1 12180 1727204088.54909: ^ task is: TASK: meta (role_complete) 12180 1727204088.54913: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 12180 1727204088.54924: getting variables 12180 1727204088.54926: in VariableManager get_vars() 12180 1727204088.54970: Calling all_inventory to load vars for managed-node1 12180 1727204088.54973: Calling groups_inventory to load vars for managed-node1 12180 1727204088.54976: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204088.54986: Calling all_plugins_play to load vars for managed-node1 12180 1727204088.54989: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204088.54992: Calling groups_plugins_play to load vars for managed-node1 12180 1727204088.60582: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204088.62162: done with get_vars() 12180 1727204088.62186: done getting variables 12180 1727204088.62258: done queuing things up, now waiting for results queue to drain 12180 1727204088.62260: results queue empty 12180 1727204088.62261: checking for any_errors_fatal 12180 1727204088.62265: done checking for any_errors_fatal 12180 1727204088.62266: checking for max_fail_percentage 12180 1727204088.62267: done checking for max_fail_percentage 12180 1727204088.62268: checking to see if all hosts have failed and the running result is not ok 12180 1727204088.62269: done checking to see if all hosts have failed 12180 1727204088.62270: getting the remaining hosts for this loop 12180 1727204088.62270: done getting the remaining hosts for this loop 12180 1727204088.62273: getting the next task for host managed-node1 12180 1727204088.62277: done getting next task for host managed-node1 12180 1727204088.62280: ^ task is: TASK: Delete the device '{{ controller_device }}' 12180 1727204088.62282: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 12180 1727204088.62287: getting variables 12180 1727204088.62289: in VariableManager get_vars() 12180 1727204088.62303: Calling all_inventory to load vars for managed-node1 12180 1727204088.62306: Calling groups_inventory to load vars for managed-node1 12180 1727204088.62308: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204088.62313: Calling all_plugins_play to load vars for managed-node1 12180 1727204088.62315: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204088.62317: Calling groups_plugins_play to load vars for managed-node1 12180 1727204088.64416: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204088.65323: done with get_vars() 12180 1727204088.65342: done getting variables 12180 1727204088.65377: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 12180 1727204088.65454: variable 'controller_device' from source: play vars TASK [Delete the device 'deprecated-bond'] ************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml:125 Tuesday 24 September 2024 14:54:48 -0400 (0:00:00.495) 0:00:36.066 ***** 12180 1727204088.65476: entering _queue_task() for managed-node1/command 12180 1727204088.65709: worker is 1 (out of 1 available) 12180 1727204088.65722: exiting 
_queue_task() for managed-node1/command 12180 1727204088.65738: done queuing things up, now waiting for results queue to drain 12180 1727204088.65740: waiting for pending results... 12180 1727204088.65923: running TaskExecutor() for managed-node1/TASK: Delete the device 'deprecated-bond' 12180 1727204088.66054: in run() - task 0affcd87-79f5-ccb1-55ae-0000000000c2 12180 1727204088.66084: variable 'ansible_search_path' from source: unknown 12180 1727204088.66145: calling self._execute() 12180 1727204088.66294: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204088.66303: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204088.66314: variable 'omit' from source: magic vars 12180 1727204088.66723: variable 'ansible_distribution_major_version' from source: facts 12180 1727204088.66740: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204088.66751: variable 'omit' from source: magic vars 12180 1727204088.66783: variable 'omit' from source: magic vars 12180 1727204088.67007: variable 'controller_device' from source: play vars 12180 1727204088.67028: variable 'omit' from source: magic vars 12180 1727204088.67088: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12180 1727204088.67130: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12180 1727204088.67172: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12180 1727204088.67216: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204088.67233: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204088.67279: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12180 
1727204088.67294: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204088.67302: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204088.67418: Set connection var ansible_pipelining to False 12180 1727204088.67426: Set connection var ansible_shell_type to sh 12180 1727204088.67436: Set connection var ansible_timeout to 10 12180 1727204088.67445: Set connection var ansible_connection to ssh 12180 1727204088.67454: Set connection var ansible_shell_executable to /bin/sh 12180 1727204088.67462: Set connection var ansible_module_compression to ZIP_DEFLATED 12180 1727204088.67506: variable 'ansible_shell_executable' from source: unknown 12180 1727204088.67514: variable 'ansible_connection' from source: unknown 12180 1727204088.67521: variable 'ansible_module_compression' from source: unknown 12180 1727204088.67526: variable 'ansible_shell_type' from source: unknown 12180 1727204088.67532: variable 'ansible_shell_executable' from source: unknown 12180 1727204088.67538: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204088.67546: variable 'ansible_pipelining' from source: unknown 12180 1727204088.67552: variable 'ansible_timeout' from source: unknown 12180 1727204088.67559: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204088.67736: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12180 1727204088.67753: variable 'omit' from source: magic vars 12180 1727204088.67761: starting attempt loop 12180 1727204088.67770: running the handler 12180 1727204088.67790: _low_level_execute_command(): starting 12180 1727204088.67802: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 
0' 12180 1727204088.69077: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204088.69094: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204088.69111: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204088.69147: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204088.69183: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204088.69205: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204088.69210: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204088.69213: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204088.69264: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204088.69278: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204088.69346: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204088.70898: stdout chunk (state=3): >>>/root <<< 12180 1727204088.71068: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204088.71175: stderr chunk (state=3): >>><<< 12180 1727204088.71179: stdout chunk (state=3): >>><<< 12180 1727204088.71246: 
_low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204088.71250: _low_level_execute_command(): starting 12180 1727204088.71255: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204088.7119172-15310-210604424859706 `" && echo ansible-tmp-1727204088.7119172-15310-210604424859706="` echo /root/.ansible/tmp/ansible-tmp-1727204088.7119172-15310-210604424859706 `" ) && sleep 0' 12180 1727204088.71854: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204088.71873: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204088.71880: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204088.71900: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204088.71952: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204088.71958: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204088.71972: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204088.71991: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204088.71994: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204088.72011: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204088.72026: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204088.72050: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204088.72058: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204088.72072: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204088.72075: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204088.72089: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204088.72195: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204088.72212: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204088.72224: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204088.72360: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204088.74263: stdout chunk (state=3): 
>>>ansible-tmp-1727204088.7119172-15310-210604424859706=/root/.ansible/tmp/ansible-tmp-1727204088.7119172-15310-210604424859706 <<< 12180 1727204088.74371: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204088.74435: stderr chunk (state=3): >>><<< 12180 1727204088.74438: stdout chunk (state=3): >>><<< 12180 1727204088.74457: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204088.7119172-15310-210604424859706=/root/.ansible/tmp/ansible-tmp-1727204088.7119172-15310-210604424859706 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204088.74486: variable 'ansible_module_compression' from source: unknown 12180 1727204088.74534: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12180cbnqllfr/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12180 1727204088.74568: variable 'ansible_facts' from source: unknown 12180 
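The exchange above shows how the controller provisions a per-task working directory on the remote host before transferring the module: a single `/bin/sh -c` invocation sets `umask 77`, `mkdir -p`s the base `~/.ansible/tmp`, then creates a uniquely named timestamped directory and echoes its path back so the controller learns where to upload `AnsiballZ_command.py`. A minimal sketch of that dance (paths and the uniqueness suffix here are illustrative, not the exact scheme Ansible uses):

```shell
# Hedged sketch of the remote temp-dir creation seen in the log.
# umask 77 makes the new directory mode 0700 (owner-only), mkdir -p
# tolerates an existing base dir, and the plain mkdir of the unique
# directory fails loudly if a collision ever occurs.
umask 77
base="$HOME/.ansible/tmp"
tmpdir="$base/ansible-tmp-$(date +%s).$$-example"
mkdir -p "$base" && mkdir "$tmpdir" && echo "$tmpdir"
```

Echoing the created path back on stdout is what lets the controller capture it (as the `ansible-tmp-...=` line in the log) without a second round trip.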
1727204088.74636: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204088.7119172-15310-210604424859706/AnsiballZ_command.py 12180 1727204088.74749: Sending initial data 12180 1727204088.74752: Sent initial data (156 bytes) 12180 1727204088.75467: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204088.75471: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204088.75508: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 12180 1727204088.75512: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204088.75515: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found <<< 12180 1727204088.75517: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204088.75570: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204088.75574: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204088.75584: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204088.75639: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204088.77382: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension 
"posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12180 1727204088.77428: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12180 1727204088.77533: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12180cbnqllfr/tmp5yrrwfxd /root/.ansible/tmp/ansible-tmp-1727204088.7119172-15310-210604424859706/AnsiballZ_command.py <<< 12180 1727204088.77583: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12180 1727204088.78422: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204088.78537: stderr chunk (state=3): >>><<< 12180 1727204088.78541: stdout chunk (state=3): >>><<< 12180 1727204088.78557: done transferring module to remote 12180 1727204088.78568: _low_level_execute_command(): starting 12180 1727204088.78574: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204088.7119172-15310-210604424859706/ /root/.ansible/tmp/ansible-tmp-1727204088.7119172-15310-210604424859706/AnsiballZ_command.py && sleep 0' 12180 1727204088.79036: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204088.79042: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204088.79077: stderr chunk 
(state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 12180 1727204088.79091: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found <<< 12180 1727204088.79104: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204088.79143: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204088.79155: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204088.79217: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204088.80961: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204088.81016: stderr chunk (state=3): >>><<< 12180 1727204088.81019: stdout chunk (state=3): >>><<< 12180 1727204088.81036: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204088.81039: _low_level_execute_command(): starting 12180 1727204088.81044: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204088.7119172-15310-210604424859706/AnsiballZ_command.py && sleep 0' 12180 1727204088.81483: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204088.81488: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204088.81523: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 12180 1727204088.81537: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204088.81549: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204088.81596: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204088.81608: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204088.81707: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204088.95394: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "Cannot find device \"deprecated-bond\"", "rc": 1, "cmd": ["ip", "link", "del", "deprecated-bond"], "start": "2024-09-24 14:54:48.946298", "end": "2024-09-24 14:54:48.953362", "delta": "0:00:00.007064", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del deprecated-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12180 1727204088.96521: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.9.148 closed. 
<<< 12180 1727204088.96601: stderr chunk (state=3): >>><<< 12180 1727204088.96604: stdout chunk (state=3): >>><<< 12180 1727204088.96673: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "Cannot find device \"deprecated-bond\"", "rc": 1, "cmd": ["ip", "link", "del", "deprecated-bond"], "start": "2024-09-24 14:54:48.946298", "end": "2024-09-24 14:54:48.953362", "delta": "0:00:00.007064", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del deprecated-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.9.148 closed. 
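The module run above returns its entire verdict as a single JSON object on stdout — `rc`, `stderr`, the parsed `cmd` list, timing, and the echoed `module_args` — which the controller then deserializes. A minimal sketch of that parsing step, using a trimmed copy of the payload from the log above:

```python
import json

# The AnsiballZ wrapper prints one JSON object on stdout; the controller
# parses it and reads "rc"/"failed" as the module's verdict.
# This sample payload is copied (and trimmed) from the log above.
stdout = ('{"changed": true, "stdout": "", '
          '"stderr": "Cannot find device \\"deprecated-bond\\"", '
          '"rc": 1, "cmd": ["ip", "link", "del", "deprecated-bond"], '
          '"failed": true, "msg": "non-zero return code"}')
result = json.loads(stdout)
print(result["rc"], result["stderr"])
```

Note that the remote process exit status (`rc=1` from the master, "Shared connection ... closed") and the `"rc": 1` inside the JSON agree here, but it is the JSON body that drives the task outcome once `failed_when` is evaluated.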
12180 1727204088.96989: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del deprecated-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204088.7119172-15310-210604424859706/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12180 1727204088.96993: _low_level_execute_command(): starting 12180 1727204088.96996: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204088.7119172-15310-210604424859706/ > /dev/null 2>&1 && sleep 0' 12180 1727204088.97933: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204088.98001: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204088.98012: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204088.98026: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204088.98065: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204088.98072: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204088.98082: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204088.98096: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204088.98232: stderr chunk (state=3): 
>>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204088.98236: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204088.98246: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204088.98255: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204088.98268: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204088.98277: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204088.98285: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204088.98293: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204088.98371: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204088.98402: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204088.98423: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204088.98520: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204089.00320: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204089.00323: stdout chunk (state=3): >>><<< 12180 1727204089.00334: stderr chunk (state=3): >>><<< 12180 1727204089.00347: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204089.00356: handler run complete 12180 1727204089.00381: Evaluated conditional (False): False 12180 1727204089.00385: Evaluated conditional (False): False 12180 1727204089.00397: attempt loop complete, returning result 12180 1727204089.00401: _execute() done 12180 1727204089.00404: dumping result to json 12180 1727204089.00408: done dumping result, returning 12180 1727204089.00419: done running TaskExecutor() for managed-node1/TASK: Delete the device 'deprecated-bond' [0affcd87-79f5-ccb1-55ae-0000000000c2] 12180 1727204089.00425: sending task result for task 0affcd87-79f5-ccb1-55ae-0000000000c2 12180 1727204089.01042: done sending task result for task 0affcd87-79f5-ccb1-55ae-0000000000c2 12180 1727204089.01048: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "cmd": [ "ip", "link", "del", "deprecated-bond" ], "delta": "0:00:00.007064", "end": "2024-09-24 14:54:48.953362", "failed_when_result": false, "rc": 1, "start": "2024-09-24 14:54:48.946298" } STDERR: Cannot find device "deprecated-bond" MSG: non-zero return code 12180 1727204089.01117: no more pending results, returning what we have 12180 1727204089.01120: results queue empty 12180 1727204089.01121: checking for any_errors_fatal 12180 1727204089.01123: done checking for any_errors_fatal 12180 1727204089.01123: checking 
for max_fail_percentage 12180 1727204089.01125: done checking for max_fail_percentage 12180 1727204089.01126: checking to see if all hosts have failed and the running result is not ok 12180 1727204089.01127: done checking to see if all hosts have failed 12180 1727204089.01127: getting the remaining hosts for this loop 12180 1727204089.01129: done getting the remaining hosts for this loop 12180 1727204089.01132: getting the next task for host managed-node1 12180 1727204089.01139: done getting next task for host managed-node1 12180 1727204089.01142: ^ task is: TASK: Remove test interfaces 12180 1727204089.01145: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 12180 1727204089.01149: getting variables 12180 1727204089.01150: in VariableManager get_vars() 12180 1727204089.01193: Calling all_inventory to load vars for managed-node1 12180 1727204089.01196: Calling groups_inventory to load vars for managed-node1 12180 1727204089.01198: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204089.01209: Calling all_plugins_play to load vars for managed-node1 12180 1727204089.01211: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204089.01214: Calling groups_plugins_play to load vars for managed-node1 12180 1727204089.02623: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204089.04398: done with get_vars() 12180 1727204089.04422: done getting variables 12180 1727204089.04472: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Remove test interfaces] ************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:3 Tuesday 24 September 2024 14:54:49 -0400 (0:00:00.390) 0:00:36.456 ***** 12180 1727204089.04498: entering _queue_task() for managed-node1/shell 12180 1727204089.04737: worker is 1 (out of 1 available) 12180 1727204089.04749: exiting _queue_task() for managed-node1/shell 12180 1727204089.04763: done queuing things up, now waiting for results queue to drain 12180 1727204089.04766: waiting for pending results... 
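Worth noting in the "Delete the device" result above: the command exited non-zero (`rc: 1`, "Cannot find device"), yet the task reports `ok` with `failed_when_result: false` — the playbook evidently treats a missing device as success, since absence is the desired end state. A hedged reconstruction of what such a task could look like (the actual task at `tests_bond_deprecated.yml:125` may differ in detail; `controller_device` is the play var visible in the log):

```yaml
# Hypothetical reconstruction based on the log output; not the verbatim task.
- name: Delete the device 'deprecated-bond'
  command: ip link del {{ controller_device }}
  failed_when: false  # rc=1 ("Cannot find device") is fine: the device being gone is the goal
```

This is the standard idempotent-cleanup pattern: run the destructive command unconditionally and suppress the failure when the target is already absent, rather than probing for existence first.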
12180 1727204089.05048: running TaskExecutor() for managed-node1/TASK: Remove test interfaces 12180 1727204089.05150: in run() - task 0affcd87-79f5-ccb1-55ae-0000000000c6 12180 1727204089.05163: variable 'ansible_search_path' from source: unknown 12180 1727204089.05169: variable 'ansible_search_path' from source: unknown 12180 1727204089.05205: calling self._execute() 12180 1727204089.05310: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204089.05314: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204089.05324: variable 'omit' from source: magic vars 12180 1727204089.05701: variable 'ansible_distribution_major_version' from source: facts 12180 1727204089.05712: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204089.05719: variable 'omit' from source: magic vars 12180 1727204089.05808: variable 'omit' from source: magic vars 12180 1727204089.06392: variable 'dhcp_interface1' from source: play vars 12180 1727204089.06404: variable 'dhcp_interface2' from source: play vars 12180 1727204089.06438: variable 'omit' from source: magic vars 12180 1727204089.06496: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12180 1727204089.06538: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12180 1727204089.06576: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12180 1727204089.06599: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204089.06616: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204089.06650: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12180 1727204089.06659: variable 'ansible_host' from source: host 
vars for 'managed-node1' 12180 1727204089.06669: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204089.06779: Set connection var ansible_pipelining to False 12180 1727204089.06795: Set connection var ansible_shell_type to sh 12180 1727204089.06806: Set connection var ansible_timeout to 10 12180 1727204089.06816: Set connection var ansible_connection to ssh 12180 1727204089.06825: Set connection var ansible_shell_executable to /bin/sh 12180 1727204089.06835: Set connection var ansible_module_compression to ZIP_DEFLATED 12180 1727204089.06872: variable 'ansible_shell_executable' from source: unknown 12180 1727204089.06880: variable 'ansible_connection' from source: unknown 12180 1727204089.06887: variable 'ansible_module_compression' from source: unknown 12180 1727204089.06900: variable 'ansible_shell_type' from source: unknown 12180 1727204089.06908: variable 'ansible_shell_executable' from source: unknown 12180 1727204089.06915: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204089.06924: variable 'ansible_pipelining' from source: unknown 12180 1727204089.06931: variable 'ansible_timeout' from source: unknown 12180 1727204089.06939: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204089.07093: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12180 1727204089.07109: variable 'omit' from source: magic vars 12180 1727204089.07125: starting attempt loop 12180 1727204089.07132: running the handler 12180 1727204089.07146: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12180 1727204089.07172: _low_level_execute_command(): starting 12180 1727204089.07184: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12180 1727204089.08025: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204089.08054: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204089.08058: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204089.08061: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204089.08132: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204089.08137: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204089.08143: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204089.08201: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204089.10169: stdout chunk (state=3): >>>/root <<< 12180 1727204089.10289: stderr chunk (state=3): >>>debug2: Received 
exit status from master 0 <<< 12180 1727204089.10340: stderr chunk (state=3): >>><<< 12180 1727204089.10343: stdout chunk (state=3): >>><<< 12180 1727204089.10476: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204089.10480: _low_level_execute_command(): starting 12180 1727204089.10483: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204089.1036572-15333-177222214836832 `" && echo ansible-tmp-1727204089.1036572-15333-177222214836832="` echo /root/.ansible/tmp/ansible-tmp-1727204089.1036572-15333-177222214836832 `" ) && sleep 0' 12180 1727204089.11217: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204089.11236: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204089.11255: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 12180 1727204089.11259: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204089.11337: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204089.11344: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204089.11347: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204089.11450: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204089.13284: stdout chunk (state=3): >>>ansible-tmp-1727204089.1036572-15333-177222214836832=/root/.ansible/tmp/ansible-tmp-1727204089.1036572-15333-177222214836832 <<< 12180 1727204089.13397: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204089.13447: stderr chunk (state=3): >>><<< 12180 1727204089.13452: stdout chunk (state=3): >>><<< 12180 1727204089.13474: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204089.1036572-15333-177222214836832=/root/.ansible/tmp/ansible-tmp-1727204089.1036572-15333-177222214836832 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204089.13498: variable 'ansible_module_compression' from source: unknown 12180 1727204089.13542: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12180cbnqllfr/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12180 1727204089.13575: variable 'ansible_facts' from source: unknown 12180 1727204089.13654: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204089.1036572-15333-177222214836832/AnsiballZ_command.py 12180 1727204089.13865: Sending initial data 12180 1727204089.13870: Sent initial data (156 bytes) 12180 1727204089.14723: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204089.14726: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204089.14748: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204089.14754: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204089.14793: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204089.14799: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204089.14809: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204089.14823: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204089.14830: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204089.14840: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204089.14849: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204089.14859: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204089.14878: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204089.14881: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204089.14886: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204089.14896: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204089.14967: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204089.14992: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204089.15000: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204089.15139: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204089.16863: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension 
"statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12180 1727204089.16948: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12180 1727204089.16999: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12180cbnqllfr/tmpqtzm9k93 /root/.ansible/tmp/ansible-tmp-1727204089.1036572-15333-177222214836832/AnsiballZ_command.py <<< 12180 1727204089.17048: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12180 1727204089.17907: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204089.18015: stderr chunk (state=3): >>><<< 12180 1727204089.18018: stdout chunk (state=3): >>><<< 12180 1727204089.18037: done transferring module to remote 12180 1727204089.18046: _low_level_execute_command(): starting 12180 1727204089.18051: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204089.1036572-15333-177222214836832/ /root/.ansible/tmp/ansible-tmp-1727204089.1036572-15333-177222214836832/AnsiballZ_command.py && sleep 0' 12180 1727204089.18499: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204089.18505: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204089.18538: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 
10.31.9.148 debug2: match not found <<< 12180 1727204089.18555: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204089.18569: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204089.18611: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204089.18623: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204089.18684: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204089.20486: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204089.20534: stderr chunk (state=3): >>><<< 12180 1727204089.20539: stdout chunk (state=3): >>><<< 12180 1727204089.20557: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204089.20561: _low_level_execute_command(): starting 12180 1727204089.20564: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204089.1036572-15333-177222214836832/AnsiballZ_command.py && sleep 0' 12180 1727204089.21020: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204089.21024: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204089.21058: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 12180 1727204089.21061: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204089.21065: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204089.21111: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master <<< 12180 1727204089.21131: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204089.21198: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204089.21238: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204089.40884: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ rc=0\n+ ip link delete test1\n+ '[' 0 '!=' 0 ']'\n+ ip link delete test2\n+ '[' 0 '!=' 0 ']'\n+ ip link delete testbr\n+ '[' 0 '!=' 0 ']'", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "start": "2024-09-24 14:54:49.343167", "end": "2024-09-24 14:54:49.408183", "delta": "0:00:00.065016", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12180 1727204089.42198: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
<<< 12180 1727204089.42202: stdout chunk (state=3): >>><<< 12180 1727204089.42205: stderr chunk (state=3): >>><<< 12180 1727204089.42425: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ rc=0\n+ ip link delete test1\n+ '[' 0 '!=' 0 ']'\n+ ip link delete test2\n+ '[' 0 '!=' 0 ']'\n+ ip link delete testbr\n+ '[' 0 '!=' 0 ']'", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "start": "2024-09-24 14:54:49.343167", "end": "2024-09-24 14:54:49.408183", "delta": "0:00:00.065016", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 12180 1727204089.42437: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link test1 - error "$rc"\nfi\nip link delete test2 || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link test2 - error "$rc"\nfi\nip link delete testbr || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link testbr - error "$rc"\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204089.1036572-15333-177222214836832/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, 
'_ansible_target_log_info': None}) 12180 1727204089.42439: _low_level_execute_command(): starting 12180 1727204089.42441: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204089.1036572-15333-177222214836832/ > /dev/null 2>&1 && sleep 0' 12180 1727204089.44468: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204089.44477: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204089.44487: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204089.44501: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204089.44541: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204089.45180: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204089.45191: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204089.45204: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204089.45212: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204089.45219: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204089.45226: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204089.45235: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204089.45247: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204089.45254: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204089.45261: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204089.45272: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204089.45340: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204089.45356: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204089.45366: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204089.45447: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204089.47420: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204089.47424: stdout chunk (state=3): >>><<< 12180 1727204089.47432: stderr chunk (state=3): >>><<< 12180 1727204089.47448: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204089.47455: handler run complete 12180 1727204089.47483: Evaluated conditional (False): 
False 12180 1727204089.47494: attempt loop complete, returning result 12180 1727204089.47497: _execute() done 12180 1727204089.47499: dumping result to json 12180 1727204089.47505: done dumping result, returning 12180 1727204089.47513: done running TaskExecutor() for managed-node1/TASK: Remove test interfaces [0affcd87-79f5-ccb1-55ae-0000000000c6] 12180 1727204089.47519: sending task result for task 0affcd87-79f5-ccb1-55ae-0000000000c6 12180 1727204089.47626: done sending task result for task 0affcd87-79f5-ccb1-55ae-0000000000c6 12180 1727204089.47629: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "delta": "0:00:00.065016", "end": "2024-09-24 14:54:49.408183", "rc": 0, "start": "2024-09-24 14:54:49.343167" } STDERR: + exec + rc=0 + ip link delete test1 + '[' 0 '!=' 0 ']' + ip link delete test2 + '[' 0 '!=' 0 ']' + ip link delete testbr + '[' 0 '!=' 0 ']' 12180 1727204089.47719: no more pending results, returning what we have 12180 1727204089.47722: results queue empty 12180 1727204089.47723: checking for any_errors_fatal 12180 1727204089.47738: done checking for any_errors_fatal 12180 1727204089.47738: checking for max_fail_percentage 12180 1727204089.47740: done checking for max_fail_percentage 12180 1727204089.47741: checking to see if all hosts have failed and the running result is not ok 12180 1727204089.47742: done checking to see if all hosts have failed 12180 1727204089.47743: getting the remaining hosts for this loop 12180 1727204089.47744: done getting the remaining hosts for this loop 12180 1727204089.47747: getting 
the next task for host managed-node1 12180 1727204089.47754: done getting next task for host managed-node1 12180 1727204089.47757: ^ task is: TASK: Stop dnsmasq/radvd services 12180 1727204089.47760: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 12180 1727204089.47769: getting variables 12180 1727204089.47771: in VariableManager get_vars() 12180 1727204089.47810: Calling all_inventory to load vars for managed-node1 12180 1727204089.47813: Calling groups_inventory to load vars for managed-node1 12180 1727204089.47815: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204089.47826: Calling all_plugins_play to load vars for managed-node1 12180 1727204089.47828: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204089.47833: Calling groups_plugins_play to load vars for managed-node1 12180 1727204089.50128: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204089.52606: done with get_vars() 12180 1727204089.52638: done getting variables 12180 1727204089.52704: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Stop dnsmasq/radvd services] ********************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:23 Tuesday 24 September 2024 14:54:49 -0400 (0:00:00.482) 0:00:36.939 ***** 12180 1727204089.52739: entering _queue_task() for managed-node1/shell 12180 1727204089.53073: worker is 1 (out of 1 available) 12180 1727204089.53085: exiting _queue_task() for managed-node1/shell 12180 1727204089.53098: done queuing things up, now waiting for results queue to drain 12180 1727204089.53099: waiting for pending results... 
12180 1727204089.53423: running TaskExecutor() for managed-node1/TASK: Stop dnsmasq/radvd services 12180 1727204089.53582: in run() - task 0affcd87-79f5-ccb1-55ae-0000000000c7 12180 1727204089.53602: variable 'ansible_search_path' from source: unknown 12180 1727204089.53610: variable 'ansible_search_path' from source: unknown 12180 1727204089.53656: calling self._execute() 12180 1727204089.53776: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204089.53789: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204089.53803: variable 'omit' from source: magic vars 12180 1727204089.54226: variable 'ansible_distribution_major_version' from source: facts 12180 1727204089.54246: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204089.54257: variable 'omit' from source: magic vars 12180 1727204089.54321: variable 'omit' from source: magic vars 12180 1727204089.54368: variable 'omit' from source: magic vars 12180 1727204089.54443: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12180 1727204089.54492: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12180 1727204089.54522: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12180 1727204089.54578: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204089.54602: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204089.55413: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12180 1727204089.55422: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204089.55432: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 
1727204089.55651: Set connection var ansible_pipelining to False 12180 1727204089.55661: Set connection var ansible_shell_type to sh 12180 1727204089.55675: Set connection var ansible_timeout to 10 12180 1727204089.55686: Set connection var ansible_connection to ssh 12180 1727204089.55697: Set connection var ansible_shell_executable to /bin/sh 12180 1727204089.55709: Set connection var ansible_module_compression to ZIP_DEFLATED 12180 1727204089.55750: variable 'ansible_shell_executable' from source: unknown 12180 1727204089.55834: variable 'ansible_connection' from source: unknown 12180 1727204089.55843: variable 'ansible_module_compression' from source: unknown 12180 1727204089.55851: variable 'ansible_shell_type' from source: unknown 12180 1727204089.55858: variable 'ansible_shell_executable' from source: unknown 12180 1727204089.55867: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204089.55876: variable 'ansible_pipelining' from source: unknown 12180 1727204089.55882: variable 'ansible_timeout' from source: unknown 12180 1727204089.55889: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204089.56244: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12180 1727204089.56382: variable 'omit' from source: magic vars 12180 1727204089.56392: starting attempt loop 12180 1727204089.56400: running the handler 12180 1727204089.56416: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12180 1727204089.56444: 
_low_level_execute_command(): starting 12180 1727204089.56457: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12180 1727204089.59163: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204089.59183: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 12180 1727204089.59399: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204089.59403: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204089.59406: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204089.59478: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204089.59491: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204089.59749: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204089.61385: stdout chunk (state=3): >>>/root <<< 12180 1727204089.61541: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204089.61627: stderr chunk (state=3): >>><<< 12180 1727204089.61633: stdout chunk (state=3): >>><<< 12180 1727204089.61757: _low_level_execute_command() done: rc=0, 
stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204089.61761: _low_level_execute_command(): starting 12180 1727204089.61763: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204089.616577-15361-259083841372621 `" && echo ansible-tmp-1727204089.616577-15361-259083841372621="` echo /root/.ansible/tmp/ansible-tmp-1727204089.616577-15361-259083841372621 `" ) && sleep 0' 12180 1727204089.62800: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204089.62804: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204089.62845: stderr chunk (state=3): >>>debug2: checking match 
for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 12180 1727204089.62849: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204089.62853: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204089.63038: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204089.63041: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204089.63106: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204089.63194: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204089.65007: stdout chunk (state=3): >>>ansible-tmp-1727204089.616577-15361-259083841372621=/root/.ansible/tmp/ansible-tmp-1727204089.616577-15361-259083841372621 <<< 12180 1727204089.65234: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204089.65303: stderr chunk (state=3): >>><<< 12180 1727204089.65307: stdout chunk (state=3): >>><<< 12180 1727204089.65328: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204089.616577-15361-259083841372621=/root/.ansible/tmp/ansible-tmp-1727204089.616577-15361-259083841372621 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204089.65361: variable 'ansible_module_compression' from source: unknown 12180 1727204089.65418: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12180cbnqllfr/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12180 1727204089.65455: variable 'ansible_facts' from source: unknown 12180 1727204089.65548: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204089.616577-15361-259083841372621/AnsiballZ_command.py 12180 1727204089.65709: Sending initial data 12180 1727204089.65713: Sent initial data (155 bytes) 12180 1727204089.67340: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204089.67394: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204089.67397: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204089.67400: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204089.67402: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.9.148 originally 10.31.9.148 <<< 12180 1727204089.67405: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204089.67407: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204089.67419: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204089.67426: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204089.67435: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204089.67442: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204089.67452: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204089.67465: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204089.67481: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204089.67484: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204089.67490: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204089.67560: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204089.67576: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204089.67585: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204089.67671: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204089.69375: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" 
revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12180 1727204089.69417: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12180 1727204089.69501: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12180cbnqllfr/tmpzqx_yd0h /root/.ansible/tmp/ansible-tmp-1727204089.616577-15361-259083841372621/AnsiballZ_command.py <<< 12180 1727204089.69546: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12180 1727204089.71092: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204089.71288: stderr chunk (state=3): >>><<< 12180 1727204089.71292: stdout chunk (state=3): >>><<< 12180 1727204089.71294: done transferring module to remote 12180 1727204089.71296: _low_level_execute_command(): starting 12180 1727204089.71298: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204089.616577-15361-259083841372621/ /root/.ansible/tmp/ansible-tmp-1727204089.616577-15361-259083841372621/AnsiballZ_command.py && sleep 0' 12180 1727204089.71949: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204089.71970: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204089.71985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204089.72002: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204089.72047: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204089.72063: stderr 
chunk (state=3): >>>debug2: match not found <<< 12180 1727204089.72089: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204089.72106: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204089.72117: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204089.72127: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204089.72142: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204089.72155: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204089.72175: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204089.72187: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204089.72200: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204089.72213: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204089.72296: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204089.72321: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204089.72338: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204089.72424: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204089.74183: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204089.74302: stderr chunk (state=3): >>><<< 12180 1727204089.74305: stdout chunk (state=3): >>><<< 12180 1727204089.74471: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204089.74475: _low_level_execute_command(): starting 12180 1727204089.74477: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204089.616577-15361-259083841372621/AnsiballZ_command.py && sleep 0' 12180 1727204089.75715: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204089.75812: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204089.75849: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204089.75899: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 12180 1727204089.75907: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration <<< 12180 1727204089.75913: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204089.75918: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204089.75948: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found <<< 12180 1727204089.75953: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204089.76042: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204089.76046: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204089.76059: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204089.76281: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204089.91702: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ pkill -F /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.lease\n+ grep 'release 6' /etc/redhat-release\n+ systemctl is-active firewalld\ninactive", "rc": 0, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "start": "2024-09-24 14:54:49.896949", "end": "2024-09-24 14:54:49.916299", "delta": "0:00:00.019350", "msg": "", 
"invocation": {"module_args": {"_raw_params": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12180 1727204089.92924: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. <<< 12180 1727204089.92928: stdout chunk (state=3): >>><<< 12180 1727204089.92934: stderr chunk (state=3): >>><<< 12180 1727204089.92969: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ pkill -F /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.lease\n+ grep 'release 6' /etc/redhat-release\n+ systemctl is-active firewalld\ninactive", "rc": 0, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "start": "2024-09-24 14:54:49.896949", "end": "2024-09-24 14:54:49.916299", "delta": "0:00:00.019350", "msg": "", "invocation": {"module_args": {"_raw_params": "set -uxo pipefail\nexec 
1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 
12180 1727204089.93079: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep \'release 6\' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service="$service"; then\n firewall-cmd --remove-service "$service"\n fi\n done\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204089.616577-15361-259083841372621/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12180 1727204089.93082: _low_level_execute_command(): starting 12180 1727204089.93084: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204089.616577-15361-259083841372621/ > /dev/null 2>&1 && sleep 0' 12180 1727204089.93815: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204089.93834: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204089.93851: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204089.93879: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 
1727204089.93926: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204089.93943: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204089.93958: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204089.93985: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204089.93999: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204089.94011: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204089.94024: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204089.94042: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204089.94059: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204089.94077: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204089.94094: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204089.94108: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204089.94211: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204089.94237: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204089.94260: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204089.94359: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204089.96183: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204089.96215: stderr chunk (state=3): >>><<< 12180 1727204089.96218: stdout chunk (state=3): >>><<< 12180 1727204089.96236: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204089.96243: handler run complete 12180 1727204089.96268: Evaluated conditional (False): False 12180 1727204089.96279: attempt loop complete, returning result 12180 1727204089.96282: _execute() done 12180 1727204089.96284: dumping result to json 12180 1727204089.96289: done dumping result, returning 12180 1727204089.96300: done running TaskExecutor() for managed-node1/TASK: Stop dnsmasq/radvd services [0affcd87-79f5-ccb1-55ae-0000000000c7] 12180 1727204089.96305: sending task result for task 0affcd87-79f5-ccb1-55ae-0000000000c7 12180 1727204089.96412: done sending task result for task 0affcd87-79f5-ccb1-55ae-0000000000c7 12180 1727204089.96415: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf 
/run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "delta": "0:00:00.019350", "end": "2024-09-24 14:54:49.916299", "rc": 0, "start": "2024-09-24 14:54:49.896949" } STDERR: + exec + pkill -F /run/dhcp_testbr.pid + rm -rf /run/dhcp_testbr.pid + rm -rf /run/dhcp_testbr.lease + grep 'release 6' /etc/redhat-release + systemctl is-active firewalld inactive 12180 1727204089.96547: no more pending results, returning what we have 12180 1727204089.96551: results queue empty 12180 1727204089.96552: checking for any_errors_fatal 12180 1727204089.96563: done checking for any_errors_fatal 12180 1727204089.96565: checking for max_fail_percentage 12180 1727204089.96567: done checking for max_fail_percentage 12180 1727204089.96568: checking to see if all hosts have failed and the running result is not ok 12180 1727204089.96570: done checking to see if all hosts have failed 12180 1727204089.96570: getting the remaining hosts for this loop 12180 1727204089.96572: done getting the remaining hosts for this loop 12180 1727204089.96576: getting the next task for host managed-node1 12180 1727204089.96586: done getting next task for host managed-node1 12180 1727204089.96589: ^ task is: TASK: Restore the /etc/resolv.conf for initscript 12180 1727204089.96592: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 12180 1727204089.96601: getting variables 12180 1727204089.96603: in VariableManager get_vars() 12180 1727204089.96646: Calling all_inventory to load vars for managed-node1 12180 1727204089.96649: Calling groups_inventory to load vars for managed-node1 12180 1727204089.96651: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204089.96666: Calling all_plugins_play to load vars for managed-node1 12180 1727204089.96669: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204089.96672: Calling groups_plugins_play to load vars for managed-node1 12180 1727204089.98278: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204090.00208: done with get_vars() 12180 1727204090.00238: done getting variables 12180 1727204090.00308: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Restore the /etc/resolv.conf for initscript] ***************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml:131 Tuesday 24 September 2024 14:54:50 -0400 (0:00:00.475) 0:00:37.415 ***** 12180 1727204090.00340: entering _queue_task() for managed-node1/command 12180 1727204090.00686: worker is 1 (out of 1 available) 12180 1727204090.00699: exiting _queue_task() for managed-node1/command 12180 1727204090.00711: done queuing 
things up, now waiting for results queue to drain 12180 1727204090.00713: waiting for pending results... 12180 1727204090.01020: running TaskExecutor() for managed-node1/TASK: Restore the /etc/resolv.conf for initscript 12180 1727204090.01144: in run() - task 0affcd87-79f5-ccb1-55ae-0000000000c8 12180 1727204090.01169: variable 'ansible_search_path' from source: unknown 12180 1727204090.01212: calling self._execute() 12180 1727204090.01315: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204090.01319: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204090.01333: variable 'omit' from source: magic vars 12180 1727204090.01609: variable 'ansible_distribution_major_version' from source: facts 12180 1727204090.01620: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204090.01702: variable 'network_provider' from source: set_fact 12180 1727204090.01707: Evaluated conditional (network_provider == "initscripts"): False 12180 1727204090.01711: when evaluation is False, skipping this task 12180 1727204090.01713: _execute() done 12180 1727204090.01716: dumping result to json 12180 1727204090.01718: done dumping result, returning 12180 1727204090.01725: done running TaskExecutor() for managed-node1/TASK: Restore the /etc/resolv.conf for initscript [0affcd87-79f5-ccb1-55ae-0000000000c8] 12180 1727204090.01733: sending task result for task 0affcd87-79f5-ccb1-55ae-0000000000c8 12180 1727204090.01821: done sending task result for task 0affcd87-79f5-ccb1-55ae-0000000000c8 12180 1727204090.01824: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 12180 1727204090.01872: no more pending results, returning what we have 12180 1727204090.01875: results queue empty 12180 1727204090.01876: checking for any_errors_fatal 12180 1727204090.01887: done checking for 
any_errors_fatal 12180 1727204090.01888: checking for max_fail_percentage 12180 1727204090.01890: done checking for max_fail_percentage 12180 1727204090.01891: checking to see if all hosts have failed and the running result is not ok 12180 1727204090.01891: done checking to see if all hosts have failed 12180 1727204090.01892: getting the remaining hosts for this loop 12180 1727204090.01893: done getting the remaining hosts for this loop 12180 1727204090.01897: getting the next task for host managed-node1 12180 1727204090.01905: done getting next task for host managed-node1 12180 1727204090.01908: ^ task is: TASK: Verify network state restored to default 12180 1727204090.01911: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 12180 1727204090.01915: getting variables 12180 1727204090.01917: in VariableManager get_vars() 12180 1727204090.01958: Calling all_inventory to load vars for managed-node1 12180 1727204090.01961: Calling groups_inventory to load vars for managed-node1 12180 1727204090.01963: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204090.01976: Calling all_plugins_play to load vars for managed-node1 12180 1727204090.01978: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204090.01981: Calling groups_plugins_play to load vars for managed-node1 12180 1727204090.02784: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204090.04330: done with get_vars() 12180 1727204090.04357: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml:136 Tuesday 24 September 2024 14:54:50 -0400 (0:00:00.040) 0:00:37.456 ***** 12180 1727204090.04430: entering _queue_task() for managed-node1/include_tasks 12180 1727204090.04660: worker is 1 (out of 1 available) 12180 1727204090.04676: exiting _queue_task() for managed-node1/include_tasks 12180 1727204090.04689: done queuing things up, now waiting for results queue to drain 12180 1727204090.04691: waiting for pending results... 
12180 1727204090.04880: running TaskExecutor() for managed-node1/TASK: Verify network state restored to default 12180 1727204090.04963: in run() - task 0affcd87-79f5-ccb1-55ae-0000000000c9 12180 1727204090.04975: variable 'ansible_search_path' from source: unknown 12180 1727204090.05004: calling self._execute() 12180 1727204090.05085: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204090.05089: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204090.05097: variable 'omit' from source: magic vars 12180 1727204090.05382: variable 'ansible_distribution_major_version' from source: facts 12180 1727204090.05392: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204090.05400: _execute() done 12180 1727204090.05403: dumping result to json 12180 1727204090.05406: done dumping result, returning 12180 1727204090.05411: done running TaskExecutor() for managed-node1/TASK: Verify network state restored to default [0affcd87-79f5-ccb1-55ae-0000000000c9] 12180 1727204090.05420: sending task result for task 0affcd87-79f5-ccb1-55ae-0000000000c9 12180 1727204090.05509: done sending task result for task 0affcd87-79f5-ccb1-55ae-0000000000c9 12180 1727204090.05512: WORKER PROCESS EXITING 12180 1727204090.05540: no more pending results, returning what we have 12180 1727204090.05545: in VariableManager get_vars() 12180 1727204090.05594: Calling all_inventory to load vars for managed-node1 12180 1727204090.05597: Calling groups_inventory to load vars for managed-node1 12180 1727204090.05600: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204090.05614: Calling all_plugins_play to load vars for managed-node1 12180 1727204090.05616: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204090.05619: Calling groups_plugins_play to load vars for managed-node1 12180 1727204090.06726: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204090.08310: done with get_vars() 12180 1727204090.08332: variable 'ansible_search_path' from source: unknown 12180 1727204090.08348: we have included files to process 12180 1727204090.08349: generating all_blocks data 12180 1727204090.08351: done generating all_blocks data 12180 1727204090.08357: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 12180 1727204090.08358: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 12180 1727204090.08360: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 12180 1727204090.08696: done processing included file 12180 1727204090.08697: iterating over new_blocks loaded from include file 12180 1727204090.08698: in VariableManager get_vars() 12180 1727204090.08711: done with get_vars() 12180 1727204090.08712: filtering new block on tags 12180 1727204090.08736: done filtering new block on tags 12180 1727204090.08738: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed-node1 12180 1727204090.08741: extending task lists for all hosts with included blocks 12180 1727204090.09483: done extending task lists 12180 1727204090.09484: done processing included files 12180 1727204090.09485: results queue empty 12180 1727204090.09486: checking for any_errors_fatal 12180 1727204090.09488: done checking for any_errors_fatal 12180 1727204090.09489: checking for max_fail_percentage 12180 1727204090.09489: done checking for max_fail_percentage 12180 1727204090.09490: checking to see if all hosts have failed and the running 
result is not ok 12180 1727204090.09491: done checking to see if all hosts have failed 12180 1727204090.09491: getting the remaining hosts for this loop 12180 1727204090.09492: done getting the remaining hosts for this loop 12180 1727204090.09493: getting the next task for host managed-node1 12180 1727204090.09496: done getting next task for host managed-node1 12180 1727204090.09497: ^ task is: TASK: Check routes and DNS 12180 1727204090.09499: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 12180 1727204090.09501: getting variables 12180 1727204090.09502: in VariableManager get_vars() 12180 1727204090.09512: Calling all_inventory to load vars for managed-node1 12180 1727204090.09514: Calling groups_inventory to load vars for managed-node1 12180 1727204090.09515: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204090.09519: Calling all_plugins_play to load vars for managed-node1 12180 1727204090.09521: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204090.09522: Calling groups_plugins_play to load vars for managed-node1 12180 1727204090.10225: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204090.12123: done with get_vars() 12180 1727204090.12153: done getting variables 12180 1727204090.12205: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Tuesday 24 September 2024 14:54:50 -0400 (0:00:00.078) 0:00:37.534 ***** 12180 1727204090.12236: entering _queue_task() for managed-node1/shell 12180 1727204090.12854: worker is 1 (out of 1 available) 12180 1727204090.12868: exiting _queue_task() for managed-node1/shell 12180 1727204090.12881: done queuing things up, now waiting for results queue to drain 12180 1727204090.12883: waiting for pending results... 
12180 1727204090.13670: running TaskExecutor() for managed-node1/TASK: Check routes and DNS 12180 1727204090.13802: in run() - task 0affcd87-79f5-ccb1-55ae-000000000570 12180 1727204090.13834: variable 'ansible_search_path' from source: unknown 12180 1727204090.13844: variable 'ansible_search_path' from source: unknown 12180 1727204090.13889: calling self._execute() 12180 1727204090.14005: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204090.14020: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204090.14046: variable 'omit' from source: magic vars 12180 1727204090.14468: variable 'ansible_distribution_major_version' from source: facts 12180 1727204090.14494: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204090.14508: variable 'omit' from source: magic vars 12180 1727204090.14562: variable 'omit' from source: magic vars 12180 1727204090.14613: variable 'omit' from source: magic vars 12180 1727204090.14663: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12180 1727204090.14716: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12180 1727204090.14748: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12180 1727204090.14774: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204090.14791: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204090.14836: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12180 1727204090.14846: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204090.14854: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204090.14971: 
Set connection var ansible_pipelining to False 12180 1727204090.14981: Set connection var ansible_shell_type to sh 12180 1727204090.14991: Set connection var ansible_timeout to 10 12180 1727204090.14994: Set connection var ansible_connection to ssh 12180 1727204090.15000: Set connection var ansible_shell_executable to /bin/sh 12180 1727204090.15004: Set connection var ansible_module_compression to ZIP_DEFLATED 12180 1727204090.15031: variable 'ansible_shell_executable' from source: unknown 12180 1727204090.15035: variable 'ansible_connection' from source: unknown 12180 1727204090.15038: variable 'ansible_module_compression' from source: unknown 12180 1727204090.15041: variable 'ansible_shell_type' from source: unknown 12180 1727204090.15043: variable 'ansible_shell_executable' from source: unknown 12180 1727204090.15047: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204090.15049: variable 'ansible_pipelining' from source: unknown 12180 1727204090.15052: variable 'ansible_timeout' from source: unknown 12180 1727204090.15056: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204090.15167: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12180 1727204090.15179: variable 'omit' from source: magic vars 12180 1727204090.15182: starting attempt loop 12180 1727204090.15190: running the handler 12180 1727204090.15201: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12180 1727204090.15229: 
_low_level_execute_command(): starting 12180 1727204090.15249: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12180 1727204090.16015: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204090.16048: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204090.16068: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204090.16098: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204090.16194: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204090.16205: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204090.16219: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204090.16245: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204090.16269: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204090.16290: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204090.16311: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204090.16348: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204090.16416: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204090.16420: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204090.16424: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 
1727204090.16484: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204090.18070: stdout chunk (state=3): >>>/root <<< 12180 1727204090.18179: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204090.18236: stderr chunk (state=3): >>><<< 12180 1727204090.18242: stdout chunk (state=3): >>><<< 12180 1727204090.18271: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204090.18281: _low_level_execute_command(): starting 12180 1727204090.18290: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204090.1826982-15396-279544496615989 `" && echo ansible-tmp-1727204090.1826982-15396-279544496615989="` echo 
/root/.ansible/tmp/ansible-tmp-1727204090.1826982-15396-279544496615989 `" ) && sleep 0' 12180 1727204090.18746: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204090.18753: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204090.18763: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204090.18778: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204090.18808: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204090.18815: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204090.18825: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204090.18835: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204090.18839: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204090.18851: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204090.18854: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204090.18863: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204090.18870: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204090.18879: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204090.18927: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204090.18944: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204090.18949: stderr chunk (state=3): 
>>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204090.19023: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204090.20879: stdout chunk (state=3): >>>ansible-tmp-1727204090.1826982-15396-279544496615989=/root/.ansible/tmp/ansible-tmp-1727204090.1826982-15396-279544496615989 <<< 12180 1727204090.21074: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204090.21133: stderr chunk (state=3): >>><<< 12180 1727204090.21137: stdout chunk (state=3): >>><<< 12180 1727204090.21155: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204090.1826982-15396-279544496615989=/root/.ansible/tmp/ansible-tmp-1727204090.1826982-15396-279544496615989 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204090.21184: variable 'ansible_module_compression' from source: unknown 12180 
1727204090.21233: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12180cbnqllfr/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12180 1727204090.21268: variable 'ansible_facts' from source: unknown 12180 1727204090.21332: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204090.1826982-15396-279544496615989/AnsiballZ_command.py 12180 1727204090.21448: Sending initial data 12180 1727204090.21452: Sent initial data (156 bytes) 12180 1727204090.22147: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204090.22153: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204090.22186: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204090.22208: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration <<< 12180 1727204090.22211: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found <<< 12180 1727204090.22214: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204090.22275: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204090.22278: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204090.22282: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 
1727204090.22338: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204090.24111: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12180 1727204090.24158: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12180 1727204090.24250: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12180cbnqllfr/tmpbf4wg32m /root/.ansible/tmp/ansible-tmp-1727204090.1826982-15396-279544496615989/AnsiballZ_command.py <<< 12180 1727204090.24299: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12180 1727204090.25134: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204090.25243: stderr chunk (state=3): >>><<< 12180 1727204090.25246: stdout chunk (state=3): >>><<< 12180 1727204090.25263: done transferring module to remote 12180 1727204090.25274: _low_level_execute_command(): starting 12180 1727204090.25279: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204090.1826982-15396-279544496615989/ /root/.ansible/tmp/ansible-tmp-1727204090.1826982-15396-279544496615989/AnsiballZ_command.py && sleep 0' 12180 1727204090.25745: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12180 
1727204090.25752: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204090.25779: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found <<< 12180 1727204090.25792: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found <<< 12180 1727204090.25802: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204090.25844: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204090.25858: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204090.25872: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204090.25925: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204090.27679: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204090.27727: stderr chunk (state=3): >>><<< 12180 1727204090.27731: stdout chunk (state=3): >>><<< 12180 1727204090.27747: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 
originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204090.27750: _low_level_execute_command(): starting 12180 1727204090.27755: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204090.1826982-15396-279544496615989/AnsiballZ_command.py && sleep 0' 12180 1727204090.28210: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204090.28221: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204090.28249: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204090.28262: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204090.28310: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204090.28322: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204090.28398: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204090.42803: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 12:8f:92:e7:c1:ab brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.9.148/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0\n valid_lft 3277sec preferred_lft 3277sec\n inet6 fe80::108f:92ff:fee7:c1ab/64 scope link \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.8.1 dev eth0 proto dhcp src 10.31.9.148 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.9.148 metric 100 \nIP -6 ROUTE\n::1 dev lo proto kernel metric 256 pref medium\nfe80::/64 dev eth0 proto kernel metric 256 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-24 
14:54:50.418153", "end": "2024-09-24 14:54:50.427281", "delta": "0:00:00.009128", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12180 1727204090.43911: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. <<< 12180 1727204090.43977: stderr chunk (state=3): >>><<< 12180 1727204090.43981: stdout chunk (state=3): >>><<< 12180 1727204090.44001: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 12:8f:92:e7:c1:ab brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.9.148/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0\n valid_lft 3277sec preferred_lft 3277sec\n inet6 fe80::108f:92ff:fee7:c1ab/64 scope link \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.8.1 dev eth0 proto dhcp src 10.31.9.148 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.9.148 metric 100 \nIP -6 ROUTE\n::1 dev lo proto kernel metric 256 pref medium\nfe80::/64 dev eth0 proto kernel metric 256 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set 
-euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-24 14:54:50.418153", "end": "2024-09-24 14:54:50.427281", "delta": "0:00:00.009128", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 
closed. 12180 1727204090.44039: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204090.1826982-15396-279544496615989/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12180 1727204090.44046: _low_level_execute_command(): starting 12180 1727204090.44051: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204090.1826982-15396-279544496615989/ > /dev/null 2>&1 && sleep 0' 12180 1727204090.44513: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204090.44517: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204090.44553: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204090.44559: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204090.44569: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204090.44580: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204090.44585: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204090.44636: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204090.44651: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204090.44668: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204090.44721: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204090.46505: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204090.46566: stderr chunk (state=3): >>><<< 12180 1727204090.46570: stdout chunk (state=3): >>><<< 12180 1727204090.46590: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
12180 1727204090.46599: handler run complete
12180 1727204090.46616: Evaluated conditional (False): False
12180 1727204090.46624: attempt loop complete, returning result
12180 1727204090.46627: _execute() done
12180 1727204090.46629: dumping result to json
12180 1727204090.46638: done dumping result, returning
12180 1727204090.46645: done running TaskExecutor() for managed-node1/TASK: Check routes and DNS [0affcd87-79f5-ccb1-55ae-000000000570]
12180 1727204090.46650: sending task result for task 0affcd87-79f5-ccb1-55ae-000000000570
12180 1727204090.46752: done sending task result for task 0affcd87-79f5-ccb1-55ae-000000000570
12180 1727204090.46754: WORKER PROCESS EXITING
ok: [managed-node1] => {
    "changed": false,
    "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n",
    "delta": "0:00:00.009128",
    "end": "2024-09-24 14:54:50.427281",
    "rc": 0,
    "start": "2024-09-24 14:54:50.418153"
}

STDOUT:

IP
1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000
 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00
 inet 127.0.0.1/8 scope host lo
 valid_lft forever preferred_lft forever
 inet6 ::1/128 scope host
 valid_lft forever preferred_lft forever
2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000
 link/ether 12:8f:92:e7:c1:ab brd ff:ff:ff:ff:ff:ff
 altname enX0
 inet 10.31.9.148/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0
 valid_lft 3277sec preferred_lft 3277sec
 inet6 fe80::108f:92ff:fee7:c1ab/64 scope link
 valid_lft forever preferred_lft forever
IP ROUTE
default via 10.31.8.1 dev eth0 proto dhcp src 10.31.9.148 metric 100
10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.9.148 metric 100
IP -6 ROUTE
::1 dev lo proto kernel metric 256 pref medium
fe80::/64 dev eth0 proto kernel metric 256 pref medium
RESOLV
# Generated by NetworkManager
search us-east-1.aws.redhat.com
nameserver 10.29.169.13
nameserver 10.29.170.12
nameserver 10.2.32.1
12180 1727204090.46822: no more pending results, returning what we have
12180 1727204090.46826: results queue empty
12180 1727204090.46827: checking for any_errors_fatal
12180 1727204090.46828: done checking for any_errors_fatal
12180 1727204090.46829: checking for max_fail_percentage
12180 1727204090.46831: done checking for max_fail_percentage
12180 1727204090.46832: checking to see if all hosts have failed and the running result is not ok
12180 1727204090.46833: done checking to see if all hosts have failed
12180 1727204090.46833: getting the remaining hosts for this loop
12180 1727204090.46835: done getting the remaining hosts for this loop
12180 1727204090.46838: getting the next task for host managed-node1
12180 1727204090.46845: done getting next task for host managed-node1
12180 1727204090.46847: ^ task is: TASK: Verify DNS and network connectivity
12180 1727204090.46851: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
12180 1727204090.46860: getting variables
12180 1727204090.46861: in VariableManager get_vars()
12180 1727204090.46903: Calling all_inventory to load vars for managed-node1
12180 1727204090.46905: Calling groups_inventory to load vars for managed-node1
12180 1727204090.46908: Calling all_plugins_inventory to load vars for managed-node1
12180 1727204090.46919: Calling all_plugins_play to load vars for managed-node1
12180 1727204090.46921: Calling groups_plugins_inventory to load vars for managed-node1
12180 1727204090.46923: Calling groups_plugins_play to load vars for managed-node1
12180 1727204090.47830: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12180 1727204090.49429: done with get_vars()
12180 1727204090.49458: done getting variables
12180 1727204090.49525: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Verify DNS and network connectivity] *************************************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24
Tuesday 24 September 2024 14:54:50 -0400 (0:00:00.373) 0:00:37.907 *****
12180 1727204090.49555: entering _queue_task() for managed-node1/shell
12180 1727204090.50007: worker is 1 (out of 1 available)
12180
1727204090.50023: exiting _queue_task() for managed-node1/shell 12180 1727204090.50039: done queuing things up, now waiting for results queue to drain 12180 1727204090.50041: waiting for pending results... 12180 1727204090.50254: running TaskExecutor() for managed-node1/TASK: Verify DNS and network connectivity 12180 1727204090.50341: in run() - task 0affcd87-79f5-ccb1-55ae-000000000571 12180 1727204090.50351: variable 'ansible_search_path' from source: unknown 12180 1727204090.50355: variable 'ansible_search_path' from source: unknown 12180 1727204090.50386: calling self._execute() 12180 1727204090.50468: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204090.50472: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204090.50481: variable 'omit' from source: magic vars 12180 1727204090.50762: variable 'ansible_distribution_major_version' from source: facts 12180 1727204090.50774: Evaluated conditional (ansible_distribution_major_version != '6'): True 12180 1727204090.50873: variable 'ansible_facts' from source: unknown 12180 1727204090.51365: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): True 12180 1727204090.51371: variable 'omit' from source: magic vars 12180 1727204090.51408: variable 'omit' from source: magic vars 12180 1727204090.51430: variable 'omit' from source: magic vars 12180 1727204090.51467: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12180 1727204090.51499: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12180 1727204090.51515: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12180 1727204090.51532: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204090.51543: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12180 1727204090.51571: variable 'inventory_hostname' from source: host vars for 'managed-node1' 12180 1727204090.51574: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204090.51577: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204090.51648: Set connection var ansible_pipelining to False 12180 1727204090.51651: Set connection var ansible_shell_type to sh 12180 1727204090.51656: Set connection var ansible_timeout to 10 12180 1727204090.51661: Set connection var ansible_connection to ssh 12180 1727204090.51677: Set connection var ansible_shell_executable to /bin/sh 12180 1727204090.51687: Set connection var ansible_module_compression to ZIP_DEFLATED 12180 1727204090.51758: variable 'ansible_shell_executable' from source: unknown 12180 1727204090.51762: variable 'ansible_connection' from source: unknown 12180 1727204090.51768: variable 'ansible_module_compression' from source: unknown 12180 1727204090.51788: variable 'ansible_shell_type' from source: unknown 12180 1727204090.51791: variable 'ansible_shell_executable' from source: unknown 12180 1727204090.51794: variable 'ansible_host' from source: host vars for 'managed-node1' 12180 1727204090.51825: variable 'ansible_pipelining' from source: unknown 12180 1727204090.51829: variable 'ansible_timeout' from source: unknown 12180 1727204090.51834: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 12180 1727204090.51971: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12180 1727204090.51987: variable 'omit' from source: magic vars 12180 1727204090.51996: starting attempt 
loop 12180 1727204090.52005: running the handler 12180 1727204090.52022: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12180 1727204090.52045: _low_level_execute_command(): starting 12180 1727204090.52057: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12180 1727204090.52857: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204090.52862: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204090.52900: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204090.52915: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204090.52928: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204090.52938: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204090.52949: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204090.52961: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204090.52986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204090.52999: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204090.53010: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204090.53023: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204090.53107: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204090.53130: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204090.53148: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204090.53324: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204090.54889: stdout chunk (state=3): >>>/root <<< 12180 1727204090.54991: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204090.55043: stderr chunk (state=3): >>><<< 12180 1727204090.55046: stdout chunk (state=3): >>><<< 12180 1727204090.55069: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 
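The JSON blob streamed back in the stdout chunks above is the module result that the executor parses before printing the `ok: [managed-node1]` summary. A minimal sketch of that parsing step, using an abbreviated copy of the payload (the key names match the transcript, but the helper names and the success check here are illustrative, not Ansible internals):

```python
import json

# Abbreviated stand-in for the ansible.legacy.command result seen in the
# stdout chunks above; the "stdout" payload is shortened for illustration.
raw = (
    '{"changed": true, "stdout": "IP\\n1: lo: mtu 65536 ...", '
    '"stderr": "", "rc": 0, "cmd": "set -euo pipefail\\necho IP\\nip a\\n", '
    '"start": "2024-09-24 14:54:50.418153", '
    '"end": "2024-09-24 14:54:50.427281", "delta": "0:00:00.009128"}'
)

result = json.loads(raw)

# A command task is reported "ok" when rc == 0 and no "failed" flag is set.
ok = result.get("rc") == 0 and not result.get("failed", False)
first_marker = result["stdout"].splitlines()[0]

print(ok)                # True
print(first_marker)      # IP  (first echo marker of the diagnostic script)
print(result["delta"])   # 0:00:00.009128, wall-clock time of the remote command
```

The `delta`, `start`, and `end` fields are what the callback prints in the summary block; `stdout` is the raw output of the diagnostic script, split on the echoed `IP` / `IP ROUTE` / `RESOLV` markers.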
12180 1727204090.55082: _low_level_execute_command(): starting 12180 1727204090.55093: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204090.5507057-15415-97172837435811 `" && echo ansible-tmp-1727204090.5507057-15415-97172837435811="` echo /root/.ansible/tmp/ansible-tmp-1727204090.5507057-15415-97172837435811 `" ) && sleep 0' 12180 1727204090.55649: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204090.55653: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204090.55656: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204090.55658: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204090.55702: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204090.55709: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204090.55723: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204090.55736: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204090.55743: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204090.55756: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204090.55759: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204090.55767: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204090.55778: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204090.55785: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 
10.31.9.148 <<< 12180 1727204090.55791: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204090.55808: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204090.55882: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204090.55913: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204090.55931: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204090.56021: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204090.57901: stdout chunk (state=3): >>>ansible-tmp-1727204090.5507057-15415-97172837435811=/root/.ansible/tmp/ansible-tmp-1727204090.5507057-15415-97172837435811 <<< 12180 1727204090.58007: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204090.58065: stderr chunk (state=3): >>><<< 12180 1727204090.58071: stdout chunk (state=3): >>><<< 12180 1727204090.58088: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204090.5507057-15415-97172837435811=/root/.ansible/tmp/ansible-tmp-1727204090.5507057-15415-97172837435811 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204090.58114: variable 'ansible_module_compression' from source: unknown 12180 1727204090.58157: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-12180cbnqllfr/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12180 1727204090.58192: variable 'ansible_facts' from source: unknown 12180 1727204090.58255: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204090.5507057-15415-97172837435811/AnsiballZ_command.py 12180 1727204090.58368: Sending initial data 12180 1727204090.58379: Sent initial data (155 bytes) 12180 1727204090.59068: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204090.59072: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204090.59075: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204090.59108: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204090.59112: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204090.59114: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204090.59165: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204090.59180: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204090.59346: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204090.61040: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12180 1727204090.61094: stderr chunk (state=3): >>>debug1: Using server download size 261120 <<< 12180 1727204090.61098: stderr chunk (state=3): >>>debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 12180 1727204090.61188: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-12180cbnqllfr/tmppktfcdvk /root/.ansible/tmp/ansible-tmp-1727204090.5507057-15415-97172837435811/AnsiballZ_command.py <<< 12180 1727204090.61238: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 12180 1727204090.62085: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204090.62195: stderr chunk (state=3): >>><<< 12180 1727204090.62199: stdout chunk (state=3): >>><<< 12180 1727204090.62216: done transferring module to remote 12180 1727204090.62225: 
_low_level_execute_command(): starting 12180 1727204090.62233: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204090.5507057-15415-97172837435811/ /root/.ansible/tmp/ansible-tmp-1727204090.5507057-15415-97172837435811/AnsiballZ_command.py && sleep 0' 12180 1727204090.62699: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204090.62705: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204090.62740: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204090.62752: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204090.62758: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204090.62766: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration <<< 12180 1727204090.62774: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204090.62780: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204090.62788: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204090.62796: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204090.62801: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204090.62806: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204090.62860: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204090.62876: stderr 
chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204090.62952: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204090.64680: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204090.64729: stderr chunk (state=3): >>><<< 12180 1727204090.64734: stdout chunk (state=3): >>><<< 12180 1727204090.64746: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204090.64751: _low_level_execute_command(): starting 12180 1727204090.64753: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204090.5507057-15415-97172837435811/AnsiballZ_command.py && sleep 0' 12180 1727204090.65213: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204090.65226: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204090.65251: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204090.65266: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204090.65315: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204090.65330: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204090.65403: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204091.06888: stdout chunk (state=3): >>> {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 
wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 1399 0 --:--:-- --:--:-- --:--:-- 1399\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 6613 0 --:--:-- --:--:-- --:--:-- 6466", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-24 14:54:50.783309", "end": "2024-09-24 14:54:51.068163", "delta": "0:00:00.284854", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! 
curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12180 1727204091.08189: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. <<< 12180 1727204091.08239: stderr chunk (state=3): >>><<< 12180 1727204091.08243: stdout chunk (state=3): >>><<< 12180 1727204091.08270: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left 
Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 1399 0 --:--:-- --:--:-- --:--:-- 1399\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 6613 0 --:--:-- --:--:-- --:--:-- 6466", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-24 14:54:50.783309", "end": "2024-09-24 14:54:51.068163", "delta": "0:00:00.284854", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! 
curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.148 closed. 12180 1727204091.08404: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts "$host"; then\n echo FAILED to lookup host "$host"\n exit 1\n fi\n if ! 
curl -o /dev/null https://"$host"; then\n echo FAILED to contact host "$host"\n exit 1\n fi\ndone\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204090.5507057-15415-97172837435811/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12180 1727204091.08408: _low_level_execute_command(): starting 12180 1727204091.08411: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204090.5507057-15415-97172837435811/ > /dev/null 2>&1 && sleep 0' 12180 1727204091.09009: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12180 1727204091.09025: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204091.09041: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204091.09069: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204091.09955: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204091.09973: stderr chunk (state=3): >>>debug2: match not found <<< 12180 1727204091.10031: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204091.10051: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12180 1727204091.10065: stderr chunk (state=3): 
>>>debug2: resolve_canonicalize: hostname 10.31.9.148 is address <<< 12180 1727204091.10078: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12180 1727204091.10091: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12180 1727204091.10106: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12180 1727204091.10129: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12180 1727204091.10143: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 <<< 12180 1727204091.10155: stderr chunk (state=3): >>>debug2: match found <<< 12180 1727204091.10173: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12180 1727204091.10328: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 12180 1727204091.10377: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12180 1727204091.10468: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12180 1727204091.10559: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12180 1727204091.12442: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12180 1727204091.12450: stdout chunk (state=3): >>><<< 12180 1727204091.12453: stderr chunk (state=3): >>><<< 12180 1727204091.12584: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.148 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.148 originally 10.31.9.148 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12180 1727204091.12588: handler run complete 12180 1727204091.12591: Evaluated conditional (False): False 12180 1727204091.12593: attempt loop complete, returning result 12180 1727204091.12595: _execute() done 12180 1727204091.12597: dumping result to json 12180 1727204091.12599: done dumping result, returning 12180 1727204091.12601: done running TaskExecutor() for managed-node1/TASK: Verify DNS and network connectivity [0affcd87-79f5-ccb1-55ae-000000000571] 12180 1727204091.12603: sending task result for task 0affcd87-79f5-ccb1-55ae-000000000571 12180 1727204091.12744: done sending task result for task 0affcd87-79f5-ccb1-55ae-000000000571 12180 1727204091.12747: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! 
curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "delta": "0:00:00.284854", "end": "2024-09-24 14:54:51.068163", "rc": 0, "start": "2024-09-24 14:54:50.783309" } STDOUT: CHECK DNS AND CONNECTIVITY 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org STDERR: % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 305 100 305 0 0 1399 0 --:--:-- --:--:-- --:--:-- 1399 % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 291 100 291 0 0 6613 0 --:--:-- --:--:-- --:--:-- 6466 12180 1727204091.12827: no more pending results, returning what we have 12180 1727204091.12831: 
results queue empty 12180 1727204091.12832: checking for any_errors_fatal 12180 1727204091.12844: done checking for any_errors_fatal 12180 1727204091.12845: checking for max_fail_percentage 12180 1727204091.12847: done checking for max_fail_percentage 12180 1727204091.12848: checking to see if all hosts have failed and the running result is not ok 12180 1727204091.12849: done checking to see if all hosts have failed 12180 1727204091.12849: getting the remaining hosts for this loop 12180 1727204091.12851: done getting the remaining hosts for this loop 12180 1727204091.12855: getting the next task for host managed-node1 12180 1727204091.12874: done getting next task for host managed-node1 12180 1727204091.12877: ^ task is: TASK: meta (flush_handlers) 12180 1727204091.12879: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12180 1727204091.12885: getting variables 12180 1727204091.12887: in VariableManager get_vars() 12180 1727204091.12931: Calling all_inventory to load vars for managed-node1 12180 1727204091.12934: Calling groups_inventory to load vars for managed-node1 12180 1727204091.12936: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204091.12949: Calling all_plugins_play to load vars for managed-node1 12180 1727204091.12952: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204091.12955: Calling groups_plugins_play to load vars for managed-node1 12180 1727204091.15850: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204091.19152: done with get_vars() 12180 1727204091.19186: done getting variables 12180 1727204091.19260: in VariableManager get_vars() 12180 1727204091.19279: Calling all_inventory to load vars for managed-node1 12180 1727204091.19282: Calling groups_inventory to load vars for managed-node1 12180 1727204091.19284: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204091.19289: Calling all_plugins_play to load vars for managed-node1 12180 1727204091.19292: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204091.19294: Calling groups_plugins_play to load vars for managed-node1 12180 1727204091.20489: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204091.23747: done with get_vars() 12180 1727204091.23792: done queuing things up, now waiting for results queue to drain 12180 1727204091.23794: results queue empty 12180 1727204091.23795: checking for any_errors_fatal 12180 1727204091.23800: done checking for any_errors_fatal 12180 1727204091.23800: checking for max_fail_percentage 12180 1727204091.23801: done checking for max_fail_percentage 12180 1727204091.23802: checking to see if all hosts have failed and the running result is not 
ok 12180 1727204091.23803: done checking to see if all hosts have failed 12180 1727204091.23804: getting the remaining hosts for this loop 12180 1727204091.23805: done getting the remaining hosts for this loop 12180 1727204091.23808: getting the next task for host managed-node1 12180 1727204091.23812: done getting next task for host managed-node1 12180 1727204091.23813: ^ task is: TASK: meta (flush_handlers) 12180 1727204091.23815: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12180 1727204091.23822: getting variables 12180 1727204091.23823: in VariableManager get_vars() 12180 1727204091.23839: Calling all_inventory to load vars for managed-node1 12180 1727204091.23841: Calling groups_inventory to load vars for managed-node1 12180 1727204091.23843: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204091.23849: Calling all_plugins_play to load vars for managed-node1 12180 1727204091.23851: Calling groups_plugins_inventory to load vars for managed-node1 12180 1727204091.23854: Calling groups_plugins_play to load vars for managed-node1 12180 1727204091.25388: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204091.28373: done with get_vars() 12180 1727204091.28452: done getting variables 12180 1727204091.28541: in VariableManager get_vars() 12180 1727204091.28571: Calling all_inventory to load vars for managed-node1 12180 1727204091.28574: Calling groups_inventory to load vars for managed-node1 12180 1727204091.28577: Calling all_plugins_inventory to load vars for managed-node1 12180 1727204091.28585: Calling all_plugins_play to load vars for managed-node1 12180 1727204091.28588: Calling groups_plugins_inventory to load vars for 
managed-node1 12180 1727204091.28592: Calling groups_plugins_play to load vars for managed-node1 12180 1727204091.29334: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12180 1727204091.31233: done with get_vars() 12180 1727204091.31272: done queuing things up, now waiting for results queue to drain 12180 1727204091.31274: results queue empty 12180 1727204091.31275: checking for any_errors_fatal 12180 1727204091.31277: done checking for any_errors_fatal 12180 1727204091.31278: checking for max_fail_percentage 12180 1727204091.31279: done checking for max_fail_percentage 12180 1727204091.31279: checking to see if all hosts have failed and the running result is not ok 12180 1727204091.31280: done checking to see if all hosts have failed 12180 1727204091.31281: getting the remaining hosts for this loop 12180 1727204091.31282: done getting the remaining hosts for this loop 12180 1727204091.31285: getting the next task for host managed-node1 12180 1727204091.31289: done getting next task for host managed-node1 12180 1727204091.31289: ^ task is: None 12180 1727204091.31291: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12180 1727204091.31293: done queuing things up, now waiting for results queue to drain 12180 1727204091.31293: results queue empty 12180 1727204091.31294: checking for any_errors_fatal 12180 1727204091.31295: done checking for any_errors_fatal 12180 1727204091.31295: checking for max_fail_percentage 12180 1727204091.31296: done checking for max_fail_percentage 12180 1727204091.31297: checking to see if all hosts have failed and the running result is not ok 12180 1727204091.31298: done checking to see if all hosts have failed 12180 1727204091.31300: getting the next task for host managed-node1 12180 1727204091.31303: done getting next task for host managed-node1 12180 1727204091.31303: ^ task is: None 12180 1727204091.31305: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
PLAY RECAP *********************************************************************
managed-node1              : ok=76   changed=3    unreachable=0    failed=0    skipped=60   rescued=0    ignored=0

Tuesday 24 September 2024  14:54:51 -0400 (0:00:00.818)       0:00:38.726 *****
===============================================================================
Install dnsmasq --------------------------------------------------------- 3.62s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:3
Create test interfaces -------------------------------------------------- 2.25s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:35
Gathering Facts --------------------------------------------------------- 2.08s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_deprecated_nm.yml:6
fedora.linux_system_roles.network : Check which services are running ---- 1.94s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.65s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Install pgrep, sysctl --------------------------------------------------- 1.43s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:26
fedora.linux_system_roles.network : Check which packages are installed --- 1.39s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 1.31s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Check if system is ostree ----------------------------------------------- 1.19s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17
Gathering Facts --------------------------------------------------------- 1.02s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml:3
fedora.linux_system_roles.network : Configure networking connection profiles --- 1.00s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.95s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Check which packages are installed --- 0.90s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Gather the minimum subset of ansible_facts required by the network role test --- 0.84s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3
Verify DNS and network connectivity ------------------------------------- 0.82s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.75s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
fedora.linux_system_roles.network : Re-test connectivity ---------------- 0.51s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
fedora.linux_system_roles.network : Re-test connectivity ---------------- 0.50s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
** TEST check polling interval ------------------------------------------ 0.49s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml:75
Get stat for interface test1 -------------------------------------------- 0.49s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3
12180 1727204091.31450: RUNNING CLEANUP
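For reference, the "Verify DNS and network connectivity" task above ran a shell loop (visible in the `cmd` field of its result) that checks each mirror host with `getent` for DNS resolution and `curl` for HTTPS reachability. A minimal sketch of the same check, rewrapped as a reusable function so it can be called with arbitrary hosts — the literal task body instead ran the loop inline under `set -euo pipefail` with hard-coded mirrors and `exit 1`:

```shell
# check_hosts HOST... : verify DNS resolution and HTTPS reachability per host.
# Sketch of the task body from the log, rewrapped as a function; the original
# hard-coded mirrors.fedoraproject.org and mirrors.centos.org.
check_hosts() {
  echo CHECK DNS AND CONNECTIVITY
  local host
  for host in "$@"; do
    # DNS lookup through the system resolver, exactly as the task did
    if ! getent hosts "$host"; then
      echo FAILED to lookup host "$host"
      return 1
    fi
    # Reachability: only a transport-level curl failure is treated as fatal
    if ! curl -o /dev/null "https://$host"; then
      echo FAILED to contact host "$host"
      return 1
    fi
  done
}
```

Both mirrors resolved and answered during this run, so the task returned rc=0 and the recap finished with failed=0.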