[DEPRECATION WARNING]: ANSIBLE_COLLECTIONS_PATHS option, does not fit var naming standard, use the singular form ANSIBLE_COLLECTIONS_PATH instead. This feature will be removed from ansible-core in version 2.19. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg.
29922 1726853650.74257: starting run
ansible-playbook [core 2.17.4]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-Qi7
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
29922 1726853650.74677: Added group all to inventory
29922 1726853650.74679: Added group ungrouped to inventory
29922 1726853650.74683: Group all now contains ungrouped
29922 1726853650.74686: Examining possible inventory source: /tmp/network-iHm/inventory.yml
29922 1726853650.85823: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache
29922 1726853650.85867: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py
29922 1726853650.85885: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory
29922 1726853650.85926: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py
29922 1726853650.85977: Loaded config def from plugin (inventory/script)
29922 1726853650.85978: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py
29922 1726853650.86005: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py
29922 1726853650.86064: Loaded config def from plugin (inventory/yaml)
29922 1726853650.86066: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py
29922 1726853650.86129: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py
29922 1726853650.86407: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py
29922 1726853650.86410: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py)
29922 1726853650.86412: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py)
29922 1726853650.86416: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py)
29922 1726853650.86419: Loading data from /tmp/network-iHm/inventory.yml
29922 1726853650.86464: /tmp/network-iHm/inventory.yml was not parsable by auto
29922 1726853650.86507: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py)
29922 1726853650.86534: Loading data from /tmp/network-iHm/inventory.yml
29922 1726853650.86591: group all already in inventory
29922 1726853650.86596: set inventory_file for managed_node1
29922 1726853650.86599: set inventory_dir for managed_node1
29922 1726853650.86600: Added host managed_node1 to inventory
29922 1726853650.86602: Added host managed_node1 to group all
29922 1726853650.86602: set ansible_host for managed_node1
29922 1726853650.86603:
set ansible_ssh_extra_args for managed_node1 29922 1726853650.86605: set inventory_file for managed_node2 29922 1726853650.86606: set inventory_dir for managed_node2 29922 1726853650.86607: Added host managed_node2 to inventory 29922 1726853650.86608: Added host managed_node2 to group all 29922 1726853650.86608: set ansible_host for managed_node2 29922 1726853650.86609: set ansible_ssh_extra_args for managed_node2 29922 1726853650.86610: set inventory_file for managed_node3 29922 1726853650.86612: set inventory_dir for managed_node3 29922 1726853650.86612: Added host managed_node3 to inventory 29922 1726853650.86613: Added host managed_node3 to group all 29922 1726853650.86613: set ansible_host for managed_node3 29922 1726853650.86614: set ansible_ssh_extra_args for managed_node3 29922 1726853650.86615: Reconcile groups and hosts in inventory. 29922 1726853650.86618: Group ungrouped now contains managed_node1 29922 1726853650.86619: Group ungrouped now contains managed_node2 29922 1726853650.86620: Group ungrouped now contains managed_node3 29922 1726853650.86677: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name 29922 1726853650.86753: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments 29922 1726853650.86788: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py 29922 1726853650.86806: Loaded config def from plugin (vars/host_group_vars) 29922 1726853650.86807: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True) 29922 1726853650.86812: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars 29922 1726853650.86817: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 29922 1726853650.86845: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False) 29922 1726853650.87101: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853650.87169: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py 29922 1726853650.87195: Loaded config def from plugin (connection/local) 29922 1726853650.87197: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True) 29922 1726853650.87584: Loaded config def from plugin (connection/paramiko_ssh) 29922 1726853650.87586: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True) 29922 1726853650.88146: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 29922 1726853650.88173: Loaded config def from plugin (connection/psrp) 29922 1726853650.88175: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True) 29922 1726853650.88589: Loading ModuleDocFragment 'connection_pipelining' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 29922 1726853650.88612: Loaded config def from plugin (connection/ssh) 29922 1726853650.88614: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True) 29922 1726853650.89899: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 29922 1726853650.89922: Loaded config def from plugin (connection/winrm) 29922 1726853650.89924: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True) 29922 1726853650.89947: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name 29922 1726853650.89994: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py 29922 1726853650.90032: Loaded config def from plugin (shell/cmd) 29922 1726853650.90034: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True) 29922 1726853650.90050: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False) 29922 1726853650.90093: Loaded config def from plugin (shell/powershell) 29922 1726853650.90095: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True) 29922 1726853650.90129: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py 29922 1726853650.90236: Loaded config def from plugin (shell/sh) 29922 1726853650.90238: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True) 29922 1726853650.90261: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name 29922 1726853650.90334: Loaded config def from plugin (become/runas) 29922 1726853650.90336: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True) 29922 1726853650.90447: Loaded config def from plugin (become/su) 29922 1726853650.90449: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True) 29922 1726853650.90546: Loaded config def from plugin (become/sudo) 29922 1726853650.90548: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True) running playbook inside collection fedora.linux_system_roles 29922 1726853650.90575: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tests_routing_rules_nm.yml 29922 1726853650.90789: in VariableManager get_vars() 29922 1726853650.90803: done with get_vars() 29922 1726853650.90892: trying /usr/local/lib/python3.12/site-packages/ansible/modules 29922 1726853650.92897: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action 29922 1726853650.92979: in VariableManager 
get_vars() 29922 1726853650.92982: done with get_vars() 29922 1726853650.92984: variable 'playbook_dir' from source: magic vars 29922 1726853650.92985: variable 'ansible_playbook_python' from source: magic vars 29922 1726853650.92985: variable 'ansible_config_file' from source: magic vars 29922 1726853650.92986: variable 'groups' from source: magic vars 29922 1726853650.92986: variable 'omit' from source: magic vars 29922 1726853650.92987: variable 'ansible_version' from source: magic vars 29922 1726853650.92987: variable 'ansible_check_mode' from source: magic vars 29922 1726853650.92988: variable 'ansible_diff_mode' from source: magic vars 29922 1726853650.92988: variable 'ansible_forks' from source: magic vars 29922 1726853650.92988: variable 'ansible_inventory_sources' from source: magic vars 29922 1726853650.92989: variable 'ansible_skip_tags' from source: magic vars 29922 1726853650.92989: variable 'ansible_limit' from source: magic vars 29922 1726853650.92990: variable 'ansible_run_tags' from source: magic vars 29922 1726853650.92990: variable 'ansible_verbosity' from source: magic vars 29922 1726853650.93013: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml 29922 1726853650.93746: in VariableManager get_vars() 29922 1726853650.93770: done with get_vars() 29922 1726853650.93810: in VariableManager get_vars() 29922 1726853650.93829: done with get_vars() 29922 1726853650.93869: in VariableManager get_vars() 29922 1726853650.93885: done with get_vars() 29922 1726853650.93961: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__ 29922 1726853650.94139: in VariableManager get_vars() 29922 1726853650.94154: done with get_vars() 29922 1726853650.94162: variable 'omit' from source: magic vars 29922 1726853650.94184: variable 'omit' from source: magic vars 29922 1726853650.94229: in VariableManager get_vars() 29922 1726853650.94242: done with get_vars() 29922 1726853650.94312: in VariableManager get_vars() 29922 1726853650.94330: done with get_vars() 29922 1726853650.94389: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 29922 1726853650.94644: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 29922 1726853650.94792: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 29922 1726853650.95777: in VariableManager get_vars() 29922 1726853650.95802: done with get_vars() 29922 1726853650.96310: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__ redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 29922 1726853651.00906: in VariableManager get_vars() 29922 1726853651.00909: done with get_vars() 29922 1726853651.00911: variable 'playbook_dir' from source: magic vars 29922 1726853651.00912: variable 'ansible_playbook_python' from source: magic vars 29922 1726853651.00912: variable 'ansible_config_file' from source: magic vars 29922 1726853651.00912: variable 'groups' from source: magic vars 29922 1726853651.00913: variable 'omit' from source: magic vars 29922 1726853651.00913: variable 'ansible_version' from source: magic vars 29922 1726853651.00914: variable 'ansible_check_mode' from source: magic vars 29922 1726853651.00914: variable 'ansible_diff_mode' from source: magic vars 29922 1726853651.00915: variable 'ansible_forks' from source: magic vars 29922 
1726853651.00915: variable 'ansible_inventory_sources' from source: magic vars 29922 1726853651.00916: variable 'ansible_skip_tags' from source: magic vars 29922 1726853651.00916: variable 'ansible_limit' from source: magic vars 29922 1726853651.00917: variable 'ansible_run_tags' from source: magic vars 29922 1726853651.00917: variable 'ansible_verbosity' from source: magic vars 29922 1726853651.00938: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml 29922 1726853651.00989: in VariableManager get_vars() 29922 1726853651.00991: done with get_vars() 29922 1726853651.00993: variable 'playbook_dir' from source: magic vars 29922 1726853651.00993: variable 'ansible_playbook_python' from source: magic vars 29922 1726853651.00994: variable 'ansible_config_file' from source: magic vars 29922 1726853651.00995: variable 'groups' from source: magic vars 29922 1726853651.00995: variable 'omit' from source: magic vars 29922 1726853651.00996: variable 'ansible_version' from source: magic vars 29922 1726853651.00997: variable 'ansible_check_mode' from source: magic vars 29922 1726853651.00997: variable 'ansible_diff_mode' from source: magic vars 29922 1726853651.00998: variable 'ansible_forks' from source: magic vars 29922 1726853651.00998: variable 'ansible_inventory_sources' from source: magic vars 29922 1726853651.00999: variable 'ansible_skip_tags' from source: magic vars 29922 1726853651.00999: variable 'ansible_limit' from source: magic vars 29922 1726853651.01000: variable 'ansible_run_tags' from source: magic vars 29922 1726853651.01000: variable 'ansible_verbosity' from source: magic vars 29922 1726853651.01020: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml 29922 1726853651.01062: in VariableManager get_vars() 29922 1726853651.01072: done with get_vars() 29922 1726853651.01103: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 29922 1726853651.01173: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 29922 1726853651.01219: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 29922 1726853651.01442: in VariableManager get_vars() 29922 1726853651.01455: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 29922 1726853651.02482: in VariableManager get_vars() 29922 1726853651.02491: done with get_vars() 29922 1726853651.02515: in VariableManager get_vars() 29922 1726853651.02517: done with get_vars() 29922 1726853651.02519: variable 'playbook_dir' from source: magic vars 29922 1726853651.02519: variable 'ansible_playbook_python' from source: magic vars 29922 1726853651.02520: variable 'ansible_config_file' from source: magic vars 29922 1726853651.02520: variable 'groups' from source: magic vars 29922 1726853651.02521: variable 'omit' from source: magic vars 29922 1726853651.02521: variable 'ansible_version' from source: magic vars 29922 1726853651.02522: variable 'ansible_check_mode' from source: magic vars 29922 1726853651.02522: variable 'ansible_diff_mode' from source: magic vars 29922 1726853651.02523: variable 'ansible_forks' from source: magic vars 29922 1726853651.02523: variable 'ansible_inventory_sources' from source: magic vars 29922 1726853651.02523: variable 'ansible_skip_tags' from source: 
magic vars 29922 1726853651.02524: variable 'ansible_limit' from source: magic vars 29922 1726853651.02524: variable 'ansible_run_tags' from source: magic vars 29922 1726853651.02525: variable 'ansible_verbosity' from source: magic vars 29922 1726853651.02544: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml 29922 1726853651.02598: in VariableManager get_vars() 29922 1726853651.02607: done with get_vars() 29922 1726853651.02656: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 29922 1726853651.02762: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 29922 1726853651.02835: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 29922 1726853651.03217: in VariableManager get_vars() 29922 1726853651.03236: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 29922 1726853651.04777: in VariableManager get_vars() 29922 1726853651.04792: done with get_vars() 29922 1726853651.04827: in VariableManager get_vars() 29922 1726853651.04839: done with get_vars() 29922 1726853651.04880: in VariableManager get_vars() 29922 1726853651.04893: done with get_vars() 29922 1726853651.04958: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback 29922 1726853651.04977: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__ redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug 29922 1726853651.05212: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py 29922 1726853651.05381: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug) 29922 1726853651.05384: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-Qi7/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) 29922 1726853651.05415: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name 29922 1726853651.05442: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False) 29922 1726853651.05609: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py 29922 1726853651.05673: Loaded config def from plugin (callback/default) 29922 1726853651.05675: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 29922 1726853651.06786: Loaded config def from plugin (callback/junit) 29922 1726853651.06788: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 29922 1726853651.06832: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False) 29922 1726853651.06898: Loaded config def from plugin (callback/minimal) 29922 1726853651.06901: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 29922 1726853651.06939: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 29922 1726853651.06999: Loaded config def from plugin (callback/tree) 29922 1726853651.07002: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks 29922 1726853651.07122: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks) 29922 1726853651.07124: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-Qi7/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) Skipping callback 'default', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. 
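(The inventory source parsed at the start of this run, /tmp/network-iHm/inventory.yml, is not reproduced in the log. Based on what the yaml inventory plugin reports above (hosts managed_node1, managed_node2 and managed_node3 in the implicit all/ungrouped groups, each with ansible_host and ansible_ssh_extra_args set), a minimal sketch of such an inventory could look like the following; the addresses and SSH options are placeholders rather than values taken from this run, and the later SSH traces only show the first targeted host resolving to 10.31.11.217.)

    all:
      hosts:
        managed_node1:
          ansible_host: 203.0.113.11                        # placeholder address
          ansible_ssh_extra_args: '-o StrictHostKeyChecking=no'   # placeholder option
        managed_node2:
          ansible_host: 203.0.113.12                        # placeholder address
          ansible_ssh_extra_args: '-o StrictHostKeyChecking=no'   # placeholder option
        managed_node3:
          ansible_host: 203.0.113.13                        # placeholder address
          ansible_ssh_extra_args: '-o StrictHostKeyChecking=no'   # placeholder option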
PLAYBOOK: tests_routing_rules_nm.yml *******************************************
6 plays in /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tests_routing_rules_nm.yml
29922 1726853651.07151: in VariableManager get_vars()
29922 1726853651.07164: done with get_vars()
29922 1726853651.07174: in VariableManager get_vars()
29922 1726853651.07185: done with get_vars()
29922 1726853651.07189: variable 'omit' from source: magic vars
29922 1726853651.07224: in VariableManager get_vars()
29922 1726853651.07238: done with get_vars()
29922 1726853651.07257: variable 'omit' from source: magic vars

PLAY [Run playbook 'playbooks/tests_routing_rules.yml' with nm as provider] ****
29922 1726853651.07783: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
29922 1726853651.07853: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
29922 1726853651.07885: getting the remaining hosts for this loop
29922 1726853651.07887: done getting the remaining hosts for this loop
29922 1726853651.07894: getting the next task for host managed_node3
29922 1726853651.07897: done getting next task for host managed_node3
29922 1726853651.07899: ^ task is: TASK: Gathering Facts
29922 1726853651.07901: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
29922 1726853651.07903: getting variables
29922 1726853651.07904: in VariableManager get_vars()
29922 1726853651.07913: Calling all_inventory to load vars for managed_node3
29922 1726853651.07915: Calling groups_inventory to load vars for managed_node3
29922 1726853651.07918: Calling all_plugins_inventory to load vars for managed_node3
29922 1726853651.07930: Calling all_plugins_play to load vars for managed_node3
29922 1726853651.07941: Calling groups_plugins_inventory to load vars for managed_node3
29922 1726853651.07945: Calling groups_plugins_play to load vars for managed_node3
29922 1726853651.07990: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
29922 1726853651.08043: done with get_vars()
29922 1726853651.08050: done getting variables
29922 1726853651.08246: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tests_routing_rules_nm.yml:6
Friday 20 September 2024 13:34:11 -0400 (0:00:00.012) 0:00:00.012 ******
29922 1726853651.08270: entering _queue_task() for managed_node3/gather_facts
29922 1726853651.08274: Creating lock for gather_facts
29922 1726853651.08662: worker is 1 (out of 1 available)
29922 1726853651.08675: exiting _queue_task() for managed_node3/gather_facts
29922 1726853651.08687: done queuing things up, now waiting for results queue to drain
29922 1726853651.08689: waiting for pending results...
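(The play banner above comes from tests_routing_rules_nm.yml, and the task path points at line 6 of that file; the file's contents are not shown in this log. In the fedora.linux_system_roles test suite the *_nm.yml files are normally thin wrappers that pin the provider and then import the shared test playbook, so a minimal sketch under that assumption would be something like the following; the exact task list in the real file may differ.)

    - name: Run playbook 'playbooks/tests_routing_rules.yml' with nm as provider
      hosts: all
      tasks:
        - name: Force the network role to use the NetworkManager provider
          ansible.builtin.set_fact:
            network_provider: nm

    - name: Import the shared routing-rules test playbook
      ansible.builtin.import_playbook: playbooks/tests_routing_rules.yml

(The remaining plays counted in the '6 plays in ...' line presumably come from that import and from the helper playbooks loaded earlier: down_profile+delete_interface.yml, down_profile.yml and remove_profile.yml.)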
29922 1726853651.09005: running TaskExecutor() for managed_node3/TASK: Gathering Facts 29922 1726853651.09140: in run() - task 02083763-bbaf-51d4-513b-0000000000af 29922 1726853651.09144: variable 'ansible_search_path' from source: unknown 29922 1726853651.09147: calling self._execute() 29922 1726853651.09203: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853651.09216: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853651.09230: variable 'omit' from source: magic vars 29922 1726853651.09337: variable 'omit' from source: magic vars 29922 1726853651.09374: variable 'omit' from source: magic vars 29922 1726853651.09414: variable 'omit' from source: magic vars 29922 1726853651.09467: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29922 1726853651.09509: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29922 1726853651.09534: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29922 1726853651.09556: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853651.09580: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853651.09613: variable 'inventory_hostname' from source: host vars for 'managed_node3' 29922 1726853651.09622: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853651.09630: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853651.09784: Set connection var ansible_connection to ssh 29922 1726853651.09787: Set connection var ansible_module_compression to ZIP_DEFLATED 29922 1726853651.09790: Set connection var ansible_shell_executable to /bin/sh 29922 1726853651.09792: Set connection var ansible_pipelining to False 29922 1726853651.09794: Set connection var ansible_timeout to 10 29922 1726853651.09796: Set connection var ansible_shell_type to sh 29922 1726853651.09807: variable 'ansible_shell_executable' from source: unknown 29922 1726853651.09814: variable 'ansible_connection' from source: unknown 29922 1726853651.09821: variable 'ansible_module_compression' from source: unknown 29922 1726853651.09827: variable 'ansible_shell_type' from source: unknown 29922 1726853651.09834: variable 'ansible_shell_executable' from source: unknown 29922 1726853651.09842: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853651.09850: variable 'ansible_pipelining' from source: unknown 29922 1726853651.09858: variable 'ansible_timeout' from source: unknown 29922 1726853651.09892: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853651.10059: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 29922 1726853651.10078: variable 'omit' from source: magic vars 29922 1726853651.10089: starting attempt loop 29922 1726853651.10096: running the handler 29922 1726853651.10176: variable 'ansible_facts' from source: unknown 29922 1726853651.10180: _low_level_execute_command(): starting 29922 1726853651.10182: 
_low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 29922 1726853651.10975: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853651.10992: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853651.11010: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853651.11111: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853651.12820: stdout chunk (state=3): >>>/root <<< 29922 1726853651.12967: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853651.12974: stdout chunk (state=3): >>><<< 29922 1726853651.12977: stderr chunk (state=3): >>><<< 29922 1726853651.13008: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853651.13104: _low_level_execute_command(): starting 29922 1726853651.13108: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853651.1301575-29945-263767858701362 `" && echo ansible-tmp-1726853651.1301575-29945-263767858701362="` echo /root/.ansible/tmp/ansible-tmp-1726853651.1301575-29945-263767858701362 `" ) && sleep 0' 29922 1726853651.13642: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853651.13687: stderr chunk (state=3): >>>debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853651.13698: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 29922 1726853651.13777: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853651.13803: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853651.13828: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853651.13861: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853651.13956: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853651.15981: stdout chunk (state=3): >>>ansible-tmp-1726853651.1301575-29945-263767858701362=/root/.ansible/tmp/ansible-tmp-1726853651.1301575-29945-263767858701362 <<< 29922 1726853651.16380: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853651.16384: stdout chunk (state=3): >>><<< 29922 1726853651.16387: stderr chunk (state=3): >>><<< 29922 1726853651.16389: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853651.1301575-29945-263767858701362=/root/.ansible/tmp/ansible-tmp-1726853651.1301575-29945-263767858701362 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853651.16392: variable 'ansible_module_compression' from source: unknown 29922 1726853651.16394: ANSIBALLZ: Using generic lock for ansible.legacy.setup 29922 1726853651.16396: ANSIBALLZ: Acquiring lock 29922 1726853651.16398: ANSIBALLZ: Lock acquired: 140376041361328 29922 1726853651.16400: ANSIBALLZ: Creating module 29922 1726853651.77299: ANSIBALLZ: Writing module into 
payload 29922 1726853651.77561: ANSIBALLZ: Writing module 29922 1726853651.77777: ANSIBALLZ: Renaming module 29922 1726853651.77780: ANSIBALLZ: Done creating module 29922 1726853651.77782: variable 'ansible_facts' from source: unknown 29922 1726853651.77784: variable 'inventory_hostname' from source: host vars for 'managed_node3' 29922 1726853651.77786: _low_level_execute_command(): starting 29922 1726853651.77789: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0' 29922 1726853651.78952: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853651.78969: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853651.78987: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853651.79006: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853651.79022: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 29922 1726853651.79082: stderr chunk (state=3): >>>debug2: match not found <<< 29922 1726853651.79095: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853651.79152: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853651.79169: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853651.79194: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853651.79298: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853651.81027: stdout chunk (state=3): >>>PLATFORM <<< 29922 1726853651.81130: stdout chunk (state=3): >>>Linux <<< 29922 1726853651.81133: stdout chunk (state=3): >>>FOUND /usr/bin/python3.12 /usr/bin/python3 <<< 29922 1726853651.81135: stdout chunk (state=3): >>>/usr/bin/python3 ENDFOUND <<< 29922 1726853651.81247: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853651.81314: stderr chunk (state=3): >>><<< 29922 1726853651.81322: stdout chunk (state=3): >>><<< 29922 1726853651.81389: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853651.81404 [managed_node3]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3'] 29922 1726853651.81575: _low_level_execute_command(): starting 29922 1726853651.81611: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0' 29922 1726853651.81895: Sending initial data 29922 1726853651.81898: Sent initial data (1181 bytes) 29922 1726853651.82344: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853651.82384: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853651.82409: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853651.82494: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853651.82525: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853651.82620: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853651.86123: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<< 29922 1726853651.86678: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853651.86682: stdout chunk (state=3): >>><<< 29922 1726853651.86685: stderr chunk (state=3): >>><<< 29922 1726853651.86688: 
_low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853651.86690: variable 'ansible_facts' from source: unknown 29922 1726853651.86707: variable 'ansible_facts' from source: unknown 29922 1726853651.86722: variable 'ansible_module_compression' from source: unknown 29922 1726853651.86766: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-29922avbm1zsd/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 29922 1726853651.86810: variable 'ansible_facts' from source: unknown 29922 1726853651.86984: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853651.1301575-29945-263767858701362/AnsiballZ_setup.py 29922 1726853651.87274: Sending initial data 29922 1726853651.87278: Sent initial data (154 bytes) 29922 1726853651.87867: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853651.87973: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853651.87977: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853651.87980: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853651.88188: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853651.88275: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853651.89937: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 29922 1726853651.89944: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 29922 1726853651.89952: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 29922 1726853651.89968: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 29922 1726853651.90048: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 29922 1726853651.90130: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-29922avbm1zsd/tmp5zlv4mj4 /root/.ansible/tmp/ansible-tmp-1726853651.1301575-29945-263767858701362/AnsiballZ_setup.py <<< 29922 1726853651.90133: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853651.1301575-29945-263767858701362/AnsiballZ_setup.py" <<< 29922 1726853651.90199: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-29922avbm1zsd/tmp5zlv4mj4" to remote "/root/.ansible/tmp/ansible-tmp-1726853651.1301575-29945-263767858701362/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853651.1301575-29945-263767858701362/AnsiballZ_setup.py" <<< 29922 1726853651.92241: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853651.92245: stdout chunk (state=3): >>><<< 29922 1726853651.92247: stderr chunk (state=3): >>><<< 29922 1726853651.92249: done transferring module to remote 29922 1726853651.92251: _low_level_execute_command(): starting 29922 1726853651.92253: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853651.1301575-29945-263767858701362/ /root/.ansible/tmp/ansible-tmp-1726853651.1301575-29945-263767858701362/AnsiballZ_setup.py && sleep 0' 29922 1726853651.93165: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853651.93300: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853651.93304: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853651.93368: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853651.95289: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853651.95293: stdout chunk (state=3): >>><<< 29922 1726853651.95295: stderr chunk (state=3): >>><<< 29922 1726853651.95377: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853651.95381: _low_level_execute_command(): starting 29922 1726853651.95383: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853651.1301575-29945-263767858701362/AnsiballZ_setup.py && sleep 0' 29922 1726853651.95985: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853651.96001: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853651.96016: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853651.96061: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration <<< 29922 1726853651.96077: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853651.96178: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853651.96198: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853651.96217: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853651.96240: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853651.96361: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853651.98640: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 29922 1726853651.98697: stdout chunk (state=3): >>>import _imp # builtin <<< 29922 1726853651.98701: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 29922 1726853651.98769: stdout chunk (state=3): >>>import '_io' # <<< 29922 1726853651.98774: stdout chunk (state=3): >>>import 'marshal' # <<< 29922 1726853651.98795: stdout chunk (state=3): >>>import 'posix' # <<< 29922 1726853651.98838: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 29922 1726853651.98864: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # <<< 29922 1726853651.98879: stdout chunk (state=3): >>># installed zipimport hook <<< 29922 1726853651.98924: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py <<< 29922 1726853651.98956: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # <<< 29922 1726853651.98968: stdout chunk (state=3): >>>import 'codecs' # <<< 29922 1726853651.99005: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 29922 1726853651.99035: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f62549684d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6254937b30> <<< 29922 1726853651.99075: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f625496aa50> <<< 29922 1726853651.99091: stdout chunk (state=3): >>>import '_signal' # <<< 29922 1726853651.99122: stdout chunk (state=3): >>>import '_abc' # import 'abc' # <<< 29922 1726853651.99151: stdout chunk (state=3): >>>import 'io' # <<< 29922 1726853651.99183: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 29922 1726853651.99281: stdout chunk (state=3): >>>import '_collections_abc' # <<< 29922 1726853651.99325: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 29922 1726853651.99336: stdout chunk (state=3): >>>import 'os' # <<< 29922 1726853651.99361: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 29922 1726853651.99394: stdout chunk (state=3): >>>Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' <<< 29922 1726853651.99418: stdout chunk (state=3): >>>Processing 
.pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 29922 1726853651.99437: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f625473d130> <<< 29922 1726853651.99514: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 29922 1726853651.99543: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f625473dfa0> <<< 29922 1726853651.99546: stdout chunk (state=3): >>>import 'site' # <<< 29922 1726853651.99579: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 29922 1726853651.99957: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 29922 1726853651.99987: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 29922 1726853651.99997: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 29922 1726853652.00016: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 29922 1726853652.00075: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 29922 1726853652.00108: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 29922 1726853652.00111: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 29922 1726853652.00144: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f625477be00> <<< 29922 1726853652.00147: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 29922 1726853652.00180: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 29922 1726853652.00202: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f625477bec0> <<< 29922 1726853652.00223: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 29922 1726853652.00237: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 29922 1726853652.00255: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 29922 1726853652.00316: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 29922 
1726853652.00338: stdout chunk (state=3): >>>import 'itertools' # <<< 29922 1726853652.00356: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f62547b37d0> <<< 29922 1726853652.00385: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f62547b3e60> <<< 29922 1726853652.00409: stdout chunk (state=3): >>>import '_collections' # <<< 29922 1726853652.00454: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6254793ad0> <<< 29922 1726853652.00467: stdout chunk (state=3): >>>import '_functools' # <<< 29922 1726853652.00492: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f62547911f0> <<< 29922 1726853652.00594: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6254778fb0> <<< 29922 1726853652.00617: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 29922 1726853652.00642: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 29922 1726853652.00662: stdout chunk (state=3): >>>import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 29922 1726853652.00684: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 29922 1726853652.00714: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 29922 1726853652.00764: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f62547d3770> <<< 29922 1726853652.00767: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f62547d2390> <<< 29922 1726853652.00800: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6254792090> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f62547d0bc0> <<< 29922 1726853652.00870: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' <<< 29922 1726853652.00895: stdout chunk (state=3): >>>import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6254808800> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6254778230> <<< 29922 1726853652.00905: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object 
from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 29922 1726853652.00938: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<< 29922 1726853652.00960: stdout chunk (state=3): >>># extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6254808cb0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6254808b60> <<< 29922 1726853652.00992: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6254808ef0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6254776d50> <<< 29922 1726853652.01035: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py <<< 29922 1726853652.01038: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 29922 1726853652.01063: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 29922 1726853652.01092: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 29922 1726853652.01114: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6254809580> <<< 29922 1726853652.01139: stdout chunk (state=3): >>>import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6254809250> <<< 29922 1726853652.01166: stdout chunk (state=3): >>>import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f625480a480> <<< 29922 1726853652.01185: stdout chunk (state=3): >>>import 'importlib.util' # import 'runpy' # <<< 29922 1726853652.01209: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 29922 1726853652.01261: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f62548206b0> <<< 29922 1726853652.01310: stdout chunk (state=3): >>>import 'errno' # <<< 29922 1726853652.01329: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 29922 1726853652.01348: stdout chunk (state=3): >>># extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6254821d90> # 
/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 29922 1726853652.01376: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py <<< 29922 1726853652.01390: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6254822c30> <<< 29922 1726853652.01436: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' <<< 29922 1726853652.01452: stdout chunk (state=3): >>># extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6254823290> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6254822180> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 29922 1726853652.01475: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 29922 1726853652.01526: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 29922 1726853652.01538: stdout chunk (state=3): >>>import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6254823d10> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6254823440> <<< 29922 1726853652.01569: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f625480a4e0> <<< 29922 1726853652.01597: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 29922 1726853652.01616: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 29922 1726853652.01646: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 29922 1726853652.01665: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 29922 1726853652.01694: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6254523bc0> <<< 29922 1726853652.01715: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 29922 1726853652.01764: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f625454c6b0> <<< 29922 1726853652.01785: 
stdout chunk (state=3): >>>import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f625454c410> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f625454c6e0> <<< 29922 1726853652.01818: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py <<< 29922 1726853652.01829: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 29922 1726853652.01895: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 29922 1726853652.02021: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f625454d010> <<< 29922 1726853652.02176: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' <<< 29922 1726853652.02180: stdout chunk (state=3): >>>import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f625454da00> <<< 29922 1726853652.02215: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f625454c8c0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6254521d60> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 29922 1726853652.02240: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 29922 1726853652.02262: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 29922 1726853652.02275: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f625454ee10> <<< 29922 1726853652.02324: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f625454db50> <<< 29922 1726853652.02327: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f625480abd0> <<< 29922 1726853652.02345: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 29922 1726853652.02406: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 29922 1726853652.02438: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 29922 1726853652.02457: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 29922 1726853652.02493: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f625457b140> <<< 29922 
1726853652.02560: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 29922 1726853652.02577: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 29922 1726853652.02596: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 29922 1726853652.02607: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 29922 1726853652.02645: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f625459b500> <<< 29922 1726853652.02669: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 29922 1726853652.02706: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 29922 1726853652.02775: stdout chunk (state=3): >>>import 'ntpath' # <<< 29922 1726853652.02797: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f62545fc260> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 29922 1726853652.02835: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 29922 1726853652.02855: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 29922 1726853652.02903: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 29922 1726853652.02992: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f62545fe9c0> <<< 29922 1726853652.03067: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f62545fc380> <<< 29922 1726853652.03099: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f62545c5280> <<< 29922 1726853652.03149: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253f29370> <<< 29922 1726853652.03152: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f625459a300> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f625454fd40> <<< 29922 1726853652.03344: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 29922 1726853652.03347: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f625459a420> <<< 29922 1726853652.03583: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_ju1qv90p/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib 
available <<< 29922 1726853652.03713: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853652.03741: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 29922 1726853652.03761: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 29922 1726853652.03791: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 29922 1726853652.03880: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 29922 1726853652.03903: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253f8eff0> import '_typing' # <<< 29922 1726853652.04097: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253f6dee0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253f6d040> <<< 29922 1726853652.04120: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible' # <<< 29922 1726853652.04160: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853652.04166: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853652.04197: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853652.04200: stdout chunk (state=3): >>>import 'ansible.module_utils' # # zipimport: zlib available <<< 29922 1726853652.05620: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853652.06800: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253f8cec0> <<< 29922 1726853652.06846: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 29922 1726853652.06874: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 29922 1726853652.06897: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 29922 1726853652.06918: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' <<< 29922 1726853652.06944: stdout chunk (state=3): >>># extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6253fc2990> <<< 29922 1726853652.06963: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253fc2720> <<< 29922 1726853652.06998: stdout chunk (state=3): 
>>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253fc2030> <<< 29922 1726853652.07015: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 29922 1726853652.07077: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253fc2480> <<< 29922 1726853652.07096: stdout chunk (state=3): >>>import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253f8fc80> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6253fc3710> <<< 29922 1726853652.07124: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6253fc3950> <<< 29922 1726853652.07142: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 29922 1726853652.07204: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 29922 1726853652.07215: stdout chunk (state=3): >>>import '_locale' # <<< 29922 1726853652.07268: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253fc3e90> <<< 29922 1726853652.07299: stdout chunk (state=3): >>>import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 29922 1726853652.07314: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 29922 1726853652.07350: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253e29b50> <<< 29922 1726853652.07404: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6253e2b800> <<< 29922 1726853652.07422: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 29922 1726853652.07454: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253e2c200> <<< 29922 1726853652.07476: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 29922 1726853652.07521: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 29922 1726853652.07540: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253e2d100> <<< 29922 1726853652.07550: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 29922 1726853652.07588: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 29922 1726853652.07614: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 29922 1726853652.07662: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253e2fe00> <<< 29922 1726853652.07712: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f625457b0b0> <<< 29922 1726853652.07739: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253e2e0c0> <<< 29922 1726853652.07762: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 29922 1726853652.07782: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 29922 1726853652.07813: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 29922 1726853652.07818: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 29922 1726853652.07963: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 29922 1726853652.07980: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253e37d40> <<< 29922 1726853652.08000: stdout chunk (state=3): >>>import '_tokenize' # <<< 29922 1726853652.08078: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253e36810> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253e36570> <<< 29922 1726853652.08099: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 29922 1726853652.08168: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253e36ae0> <<< 29922 1726853652.08219: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253e2e5d0> <<< 29922 1726853652.08239: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 
0x7f6253e7ba10> <<< 29922 1726853652.08274: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253e7c1a0> <<< 29922 1726853652.08282: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 29922 1726853652.08327: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 29922 1726853652.08330: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 29922 1726853652.08367: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' <<< 29922 1726853652.08384: stdout chunk (state=3): >>># extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6253e7dc10> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253e7d9d0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 29922 1726853652.08424: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 29922 1726853652.08478: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6253e801a0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253e7e2d0> <<< 29922 1726853652.08495: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 29922 1726853652.08595: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 29922 1726853652.08598: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 29922 1726853652.08631: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253e83950> <<< 29922 1726853652.08752: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253e80350> <<< 29922 1726853652.08816: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6253e847d0> <<< 29922 1726853652.08851: 
stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6253e849b0> <<< 29922 1726853652.08917: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6253e84cb0> <<< 29922 1726853652.08921: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253e7c380> <<< 29922 1726853652.08956: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 29922 1726853652.08961: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 29922 1726853652.09001: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 29922 1726853652.09008: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 29922 1726853652.09030: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6253d10380> <<< 29922 1726853652.09194: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 29922 1726853652.09231: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6253d116a0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253e86b10> <<< 29922 1726853652.09260: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6253e87e90> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253e86720> <<< 29922 1726853652.09279: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # <<< 29922 1726853652.09295: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853652.09378: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853652.09485: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 29922 1726853652.09514: stdout chunk (state=3): >>>import 
'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # <<< 29922 1726853652.09538: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853652.09646: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853652.09767: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853652.10493: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853652.10906: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 29922 1726853652.10965: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 29922 1726853652.10979: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6253d15970> <<< 29922 1726853652.11069: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253d16780> <<< 29922 1726853652.11143: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253d118e0> <<< 29922 1726853652.11183: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # <<< 29922 1726853652.11193: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853652.11343: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853652.11489: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 29922 1726853652.11518: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253d16930> # zipimport: zlib available <<< 29922 1726853652.11966: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853652.12403: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853652.12481: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853652.12554: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 29922 1726853652.12566: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853652.12602: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853652.12636: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 29922 1726853652.12650: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853652.12711: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853652.12921: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 29922 1726853652.12925: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 29922 1726853652.12994: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available <<< 29922 1726853652.13200: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853652.13809: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253d17b60> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available <<< 29922 1726853652.13813: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853652.13841: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available <<< 29922 1726853652.13879: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853652.14050: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 29922 1726853652.14290: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6253d22330> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253d1dd90> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available <<< 29922 1726853652.14350: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853652.14374: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853652.14419: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 29922 1726853652.14439: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 29922 1726853652.14487: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 29922 1726853652.14493: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 29922 1726853652.14588: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from 
'/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 29922 1726853652.14726: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253e0ab70> <<< 29922 1726853652.14739: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253efe840> <<< 29922 1726853652.14772: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253d22030> <<< 29922 1726853652.14799: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253d154f0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available <<< 29922 1726853652.14846: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853652.14850: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 29922 1726853652.15015: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available <<< 29922 1726853652.15042: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853652.15208: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853652.15212: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 29922 1726853652.15214: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853652.15276: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853652.15279: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available <<< 29922 1726853652.15377: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853652.15493: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 29922 1726853652.15496: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # # zipimport: zlib available <<< 29922 1726853652.15677: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853652.16208: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253db6690> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253954200> <<< 29922 1726853652.16232: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 29922 1726853652.16252: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6253954560> <<< 29922 1726853652.16306: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253da7530> <<< 29922 1726853652.16327: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253db71d0> <<< 29922 1726853652.16347: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253db4e60> <<< 29922 1726853652.16374: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253db4800> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 29922 1726853652.16452: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 29922 1726853652.16599: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6253957500> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253956db0> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6253956f90> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253956210> <<< 29922 1726853652.16618: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 29922 1726853652.16899: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253957680> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module 
'_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f62539ba1b0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f62539b81d0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253db49b0> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # <<< 29922 1726853652.16925: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853652.16950: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available <<< 29922 1726853652.17014: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853652.17092: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available <<< 29922 1726853652.17138: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853652.17213: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available <<< 29922 1726853652.17229: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available <<< 29922 1726853652.17267: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853652.17309: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # <<< 29922 1726853652.17312: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853652.17461: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available <<< 29922 1726853652.17464: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853652.17498: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 29922 1726853652.17518: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853652.17628: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 29922 1726853652.17998: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available <<< 29922 1726853652.18285: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853652.18687: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 29922 1726853652.18702: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853652.18742: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853652.18793: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853652.18826: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853652.18855: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # <<< 29922 1726853652.18892: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available <<< 29922 1726853652.18918: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853652.18942: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available <<< 29922 1726853652.19004: stdout chunk (state=3): 
>>># zipimport: zlib available <<< 29922 1726853652.19062: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 29922 1726853652.19104: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 29922 1726853652.19144: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # <<< 29922 1726853652.19147: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853652.19176: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853652.19217: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # <<< 29922 1726853652.19220: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853652.19291: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853652.19385: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 29922 1726853652.19406: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f62539bb500> <<< 29922 1726853652.19433: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 29922 1726853652.19530: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 29922 1726853652.19590: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f62539babd0> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available <<< 29922 1726853652.19666: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853652.19728: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 29922 1726853652.19753: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853652.19829: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853652.19917: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available <<< 29922 1726853652.19989: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853652.20116: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 29922 1726853652.20123: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853652.20137: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853652.20164: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 29922 1726853652.20234: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 29922 1726853652.20291: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 29922 1726853652.20394: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f62539f2390> <<< 29922 1726853652.20567: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f62539e2150> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available <<< 29922 1726853652.20628: stdout chunk (state=3): >>># zipimport: zlib available <<< 
29922 1726853652.20674: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available <<< 29922 1726853652.20762: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853652.20842: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853652.20954: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853652.21109: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available <<< 29922 1726853652.21202: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853652.21287: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 29922 1726853652.21290: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853652.21445: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6253a06030> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f62539e34a0> import 'ansible.module_utils.facts.system.user' # <<< 29922 1726853652.21479: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available <<< 29922 1726853652.21489: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available <<< 29922 1726853652.21649: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853652.21791: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 29922 1726853652.21891: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853652.21903: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853652.22005: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853652.22038: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853652.22114: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available <<< 29922 1726853652.22190: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 29922 1726853652.22275: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853652.22439: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available <<< 29922 1726853652.22557: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853652.22824: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available <<< 29922 1726853652.22868: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853652.22882: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853652.23311: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853652.23831: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.hardware.linux' # <<< 29922 1726853652.23846: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available <<< 29922 1726853652.23947: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853652.24075: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available <<< 29922 1726853652.24180: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853652.24245: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 29922 1726853652.24267: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853652.24407: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853652.24642: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available <<< 29922 1726853652.24695: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available <<< 29922 1726853652.25089: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 29922 1726853652.25094: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853652.25476: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # <<< 29922 1726853652.25481: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853652.25510: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853652.25576: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # <<< 29922 1726853652.25603: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 29922 1726853652.25635: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available <<< 29922 1726853652.25698: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853652.25755: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # <<< 29922 1726853652.25777: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853652.25847: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853652.26064: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available <<< 29922 1726853652.26163: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853652.26423: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 29922 1726853652.26493: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853652.26625: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # <<< 29922 1726853652.26718: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # <<< 29922 1726853652.26730: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853652.26748: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 29922 1726853652.26779: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # <<< 29922 1726853652.26792: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853652.26962: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853652.27054: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # <<< 29922 1726853652.27103: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available <<< 29922 1726853652.27127: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853652.27176: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853652.27287: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 29922 1726853652.27679: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available <<< 29922 1726853652.27707: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853652.27886: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 29922 1726853652.27941: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 29922 1726853652.28033: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available <<< 29922 1726853652.28050: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853652.28177: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # <<< 29922 1726853652.28180: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853652.28193: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853652.28342: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available <<< 29922 1726853652.28682: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available <<< 29922 1726853652.29483: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py <<< 29922 1726853652.29513: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py <<< 29922 1726853652.29534: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 29922 1726853652.29589: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 
0x7f625379e900> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f625379ce00> <<< 29922 1726853652.29672: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253796480> <<< 29922 1726853652.40965: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f62537e6900> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f62537e51f0> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' <<< 29922 1726853652.41095: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f62537e6750> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f62537e6150> <<< 29922 1726853652.41309: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 29922 1726853652.68700: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_is_chroot": false, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-11-217.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-217", "ansible_nodename": "ip-10-31-11-217.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2a85cf21f17783c9da20681cb8e352", "ansible_fips": false, "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCoYDjPBT+oZMoH+Iz89VMad3pzgkIKqOuUO8QZPyEpVfgGNjVOaglgkXLQyOXulclB6EA4nlBVmVP6IyWL+N4gskjf5Qmm5n+WHu3amXXm9l66v+yKWruV18sgIn8o6iAdCgZrJFzdFVNeKRLYV6P67syyegqRkOJe7U2m/rxA967Vm6wcrwJN8eiZc8tbx1lHOuJNCcP20ThNxMHIPfvT2im8rlt/ojf6e+Z1axRnvFBubBg1qDfxHEX6AlxMHl+CIOXxGHxsvSxfLq7lhrXrComBSxTDv+fmHeJkck3VGck2rn8Hz3eBTty453RU3pq6KxdqU1GB+Z+VYHInXEtXb2i38QpLqRfiLqwUxdjkFKqz0j5z/+NskQF3w8j6uz77Revs9G8eMs14sICJtnD9qBLhFuGxlR/ovZCjynVjBTKBfDaVASpjV0iCIZKSgLn6zSaM/6FBhnBoeb4ch2iiw2S8Q0aKJNKIp0AfY21IVplyS3VCu+Br3fIXzoINo0s=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIX3PaAHBZzPq23oQJkywX/2bB39yz9ZrSLgsNfhL04NHwnY0Up/oiN+aiUte1DWFqV5wiDLJpl9a1tDLARWXNA=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIBb2CS4KVp6KVVqnGA45j7tkSkijXfGxbd3neKpDsjh9", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_fibre_channel_wwn": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 55604 10.31.11.217 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 55604 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2990, "ansible_swaptotal_mb": 0,<<< 29922 1726853652.68767: stdout chunk (state=3): >>> "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 541, "free": 2990}, "nocache": {"free": 3308, "used": 223}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", 
"ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2a85cf-21f1-7783-c9da-20681cb8e352", "ansible_product_uuid": "ec2a85cf-21f1-7783-c9da-20681cb8e352", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 796, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261798817792, "block_size": 4096, "block_total": 65519099, "block_available": 63915727, "block_used": 1603372, "inode_total": 131070960, "inode_available": 131029147, "inode_used": 41813, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "34", "second": "12", "epoch": "1726853652", "epoch_int": "1726853652", "date": "2024-09-20", "time": "13:34:12", "iso8601_micro": "2024-09-20T17:34:12.614081Z", "iso8601": "2024-09-20T17:34:12Z", "iso8601_basic": "20240920T133412614081", "iso8601_basic_short": "20240920T133412", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_lsb": {}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_loadavg": {"1m": 0.47412109375, "5m": 0.484375, "15m": 0.2958984375}, "ansible_iscsi_iqn": "", "ansible_interfaces": ["lo", "rpltstbr", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "12:2a:53:36:f0:e9", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": 
"fe80::102a:53ff:fe36:f0e9", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": 
"on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_rpltstbr": {"device": "rpltstbr", "macaddress": "6e:7c:f1:8e:1c:81", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "ipv4": {"address": "192.0.2.72", "broadcast": "", "netmask": "255.255.255.254", "network": "192.0.2.72", "prefix": "31"}, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off 
[fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:2a:53:36:f0:e9", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.217", "192.0.2.72"], "ansible_all_ipv6_addresses": ["fe80::102a:53ff:fe36:f0e9"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.217", "127.0.0.0/8", "127.0.0.1", "192.0.2.72"], "ipv6": ["::1", "fe80::102a:53ff:fe36:f0e9"]}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_local": {}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_apparmor": {"status": "disabled"}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 29922 1726853652.69847: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] 
removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing 
ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing<<< 29922 1726853652.69853: stdout chunk (state=3): >>> ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing 
ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd <<< 29922 1726853652.70021: 
stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # 
destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 29922 1726853652.70479: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 29922 1726853652.70503: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 29922 1726853652.70538: stdout chunk (state=3): >>># destroy _bz2 <<< 29922 1726853652.70541: stdout chunk (state=3): >>># destroy _compression # destroy _lzma <<< 29922 1726853652.70575: stdout chunk (state=3): >>># destroy _blake2 <<< 29922 1726853652.70599: stdout chunk (state=3): >>># destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path <<< 29922 1726853652.70636: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob <<< 29922 1726853652.70639: stdout chunk (state=3): >>># destroy ipaddress <<< 29922 1726853652.70692: stdout chunk (state=3): >>># destroy ntpath <<< 29922 1726853652.70713: stdout chunk (state=3): >>># destroy importlib # destroy zipimport <<< 29922 1726853652.70789: stdout chunk (state=3): >>># destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid <<< 29922 1726853652.70846: stdout chunk (state=3): >>># destroy selinux # destroy shutil <<< 29922 1726853652.70996: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors <<< 29922 1726853652.71090: stdout chunk (state=3): >>># destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json <<< 29922 1726853652.71132: stdout chunk (state=3): >>># destroy socket # destroy struct # destroy glob <<< 29922 1726853652.71403: stdout chunk 
(state=3): >>># destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime <<< 29922 1726853652.71406: stdout chunk (state=3): >>># cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 29922 1726853652.71664: stdout chunk (state=3): >>># destroy sys.monitoring <<< 29922 1726853652.71667: stdout chunk (state=3): >>># destroy _socket <<< 29922 1726853652.71700: stdout chunk (state=3): >>># destroy _collections <<< 29922 1726853652.71735: stdout chunk (state=3): >>># destroy platform <<< 29922 1726853652.71745: stdout chunk (state=3): >>># destroy _uuid # destroy stat # destroy genericpath <<< 29922 1726853652.71773: stdout chunk (state=3): >>># 
destroy re._parser # destroy tokenize <<< 29922 1726853652.71808: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib <<< 29922 1726853652.71830: stdout chunk (state=3): >>># destroy copyreg # destroy contextlib <<< 29922 1726853652.71850: stdout chunk (state=3): >>># destroy _typing <<< 29922 1726853652.71882: stdout chunk (state=3): >>># destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser <<< 29922 1726853652.71885: stdout chunk (state=3): >>># destroy functools # destroy operator # destroy ansible.module_utils.six.moves <<< 29922 1726853652.71914: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 29922 1726853652.71950: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 29922 1726853652.72295: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 29922 1726853652.72595: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
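With the facts JSON emitted and the remote interpreter torn down, the SSH master reports exit status 0 and the shared connection to 10.31.11.217 closes. The values in that payload (ansible_distribution, ansible_kernel, ansible_default_ipv4, ansible_memtotal_mb, and so on) become host variables for subsequent tasks on this host; a minimal sketch of how a later task might read them, for illustration only:

```yaml
# Hypothetical follow-up task; every fact referenced here appears in the JSON payload above.
- name: Show a few of the gathered facts
  ansible.builtin.debug:
    msg: >-
      {{ ansible_distribution }} {{ ansible_distribution_version }},
      kernel {{ ansible_kernel }},
      default IPv4 {{ ansible_default_ipv4.address }},
      {{ ansible_processor_vcpus }} vCPUs,
      {{ ansible_memtotal_mb }} MB RAM
```
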
<<< 29922 1726853652.72718: stderr chunk (state=3): >>><<< 29922 1726853652.72724: stdout chunk (state=3): >>><<< 29922 1726853652.72956: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f62549684d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6254937b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f625496aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f625473d130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f625473dfa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f625477be00> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f625477bec0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f62547b37d0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f62547b3e60> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6254793ad0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f62547911f0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6254778fb0> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f62547d3770> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f62547d2390> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6254792090> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f62547d0bc0> # 
/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6254808800> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6254778230> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6254808cb0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6254808b60> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6254808ef0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6254776d50> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6254809580> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6254809250> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f625480a480> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f62548206b0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6254821d90> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f6254822c30> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6254823290> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6254822180> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6254823d10> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6254823440> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f625480a4e0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6254523bc0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f625454c6b0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f625454c410> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f625454c6e0> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f625454d010> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f625454da00> import 'hashlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f625454c8c0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6254521d60> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f625454ee10> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f625454db50> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f625480abd0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f625457b140> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f625459b500> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f62545fc260> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f62545fe9c0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f62545fc380> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f62545c5280> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253f29370> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f625459a300> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f625454fd40> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # 
<_frozen_importlib_external.SourcelessFileLoader object at 0x7f625459a420> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_ju1qv90p/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253f8eff0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253f6dee0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253f6d040> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253f8cec0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6253fc2990> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253fc2720> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253fc2030> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253fc2480> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253f8fc80> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6253fc3710> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 
'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6253fc3950> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253fc3e90> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253e29b50> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6253e2b800> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253e2c200> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253e2d100> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253e2fe00> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f625457b0b0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253e2e0c0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253e37d40> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253e36810> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f6253e36570> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253e36ae0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253e2e5d0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6253e7ba10> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253e7c1a0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6253e7dc10> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253e7d9d0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6253e801a0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253e7e2d0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253e83950> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253e80350> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6253e847d0> # extension module 'systemd._reader' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6253e849b0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6253e84cb0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253e7c380> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6253d10380> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6253d116a0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253e86b10> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6253e87e90> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253e86720> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f6253d15970> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253d16780> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253d118e0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253d16930> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253d17b60> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6253d22330> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253d1dd90> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches 
/usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253e0ab70> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253efe840> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253d22030> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253d154f0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253db6690> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253954200> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f6253954560> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253da7530> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253db71d0> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253db4e60> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253db4800> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6253957500> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253956db0> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6253956f90> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253956210> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253957680> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f62539ba1b0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f62539b81d0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253db49b0> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # 
zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f62539bb500> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f62539babd0> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f62539f2390> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f62539e2150> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from 
'/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6253a06030> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f62539e34a0> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f625379e900> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f625379ce00> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6253796480> # /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f62537e6900> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f62537e51f0> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from 
'/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f62537e6750> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f62537e6150> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_is_chroot": false, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-11-217.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-217", "ansible_nodename": "ip-10-31-11-217.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2a85cf21f17783c9da20681cb8e352", "ansible_fips": false, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCoYDjPBT+oZMoH+Iz89VMad3pzgkIKqOuUO8QZPyEpVfgGNjVOaglgkXLQyOXulclB6EA4nlBVmVP6IyWL+N4gskjf5Qmm5n+WHu3amXXm9l66v+yKWruV18sgIn8o6iAdCgZrJFzdFVNeKRLYV6P67syyegqRkOJe7U2m/rxA967Vm6wcrwJN8eiZc8tbx1lHOuJNCcP20ThNxMHIPfvT2im8rlt/ojf6e+Z1axRnvFBubBg1qDfxHEX6AlxMHl+CIOXxGHxsvSxfLq7lhrXrComBSxTDv+fmHeJkck3VGck2rn8Hz3eBTty453RU3pq6KxdqU1GB+Z+VYHInXEtXb2i38QpLqRfiLqwUxdjkFKqz0j5z/+NskQF3w8j6uz77Revs9G8eMs14sICJtnD9qBLhFuGxlR/ovZCjynVjBTKBfDaVASpjV0iCIZKSgLn6zSaM/6FBhnBoeb4ch2iiw2S8Q0aKJNKIp0AfY21IVplyS3VCu+Br3fIXzoINo0s=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIX3PaAHBZzPq23oQJkywX/2bB39yz9ZrSLgsNfhL04NHwnY0Up/oiN+aiUte1DWFqV5wiDLJpl9a1tDLARWXNA=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIBb2CS4KVp6KVVqnGA45j7tkSkijXfGxbd3neKpDsjh9", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_fibre_channel_wwn": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 55604 10.31.11.217 22", "XDG_SESSION_CLASS": 
"user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 55604 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2990, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 541, "free": 2990}, "nocache": {"free": 3308, "used": 223}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2a85cf-21f1-7783-c9da-20681cb8e352", "ansible_product_uuid": "ec2a85cf-21f1-7783-c9da-20681cb8e352", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 796, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261798817792, "block_size": 4096, "block_total": 65519099, "block_available": 63915727, "block_used": 1603372, "inode_total": 131070960, "inode_available": 131029147, "inode_used": 41813, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_virtualization_type": "xen", "ansible_virtualization_role": 
"guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "34", "second": "12", "epoch": "1726853652", "epoch_int": "1726853652", "date": "2024-09-20", "time": "13:34:12", "iso8601_micro": "2024-09-20T17:34:12.614081Z", "iso8601": "2024-09-20T17:34:12Z", "iso8601_basic": "20240920T133412614081", "iso8601_basic_short": "20240920T133412", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_lsb": {}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_loadavg": {"1m": 0.47412109375, "5m": 0.484375, "15m": 0.2958984375}, "ansible_iscsi_iqn": "", "ansible_interfaces": ["lo", "rpltstbr", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "12:2a:53:36:f0:e9", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::102a:53ff:fe36:f0e9", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off 
[fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_rpltstbr": {"device": "rpltstbr", "macaddress": "6e:7c:f1:8e:1c:81", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "ipv4": {"address": "192.0.2.72", "broadcast": "", "netmask": "255.255.255.254", "network": "192.0.2.72", "prefix": "31"}, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", 
"tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:2a:53:36:f0:e9", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.217", "192.0.2.72"], "ansible_all_ipv6_addresses": ["fe80::102a:53ff:fe36:f0e9"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.217", "127.0.0.0/8", "127.0.0.1", "192.0.2.72"], "ipv6": ["::1", "fe80::102a:53ff:fe36:f0e9"]}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_local": {}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_apparmor": {"status": "disabled"}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing 
_thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json 
# cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy 
ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing 
ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy 
ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector 
# destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy 
systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
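The mux lines in the stderr above (debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' ... Received exit status from master 0) come from OpenSSH ControlMaster multiplexing: each module invocation is sent over an already-established master connection rather than a fresh SSH handshake. A hypothetical ansible.cfg sketch that makes those knobs explicit (illustrative only, not the configuration used in this run; the values shown are the usual defaults):

# ansible.cfg -- SSH connection multiplexing (illustrative sketch)
[ssh_connection]
# keep a master connection open and reuse it for subsequent commands
ssh_args = -C -o ControlMaster=auto -o ControlPersist=60s
# directory for the ControlPath sockets, e.g. ~/.ansible/cp/bee039678b seen above
control_path_dir = ~/.ansible/cp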
[WARNING]: Module invocation had junk after the JSON data: # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] 
removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy 
ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] 
removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy 
ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy 
zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping 
_abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks [WARNING]: Platform linux on host managed_node3 is using the discovered Python interpreter at /usr/bin/python3.12, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible- core/2.17/reference_appendices/interpreter_discovery.html for more information. 
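The interpreter-discovery warning above can be avoided by pinning the remote Python interpreter instead of relying on discovery; a hypothetical ansible.cfg sketch (illustrative only, not part of this run):

# ansible.cfg -- pin the remote Python interpreter (illustrative sketch)
[defaults]
# an explicit path skips discovery; 'auto_silent' would keep discovery but drop the warning
interpreter_python = /usr/bin/python3.12

The same effect can be had per host by setting the ansible_python_interpreter variable in inventory.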
29922 1726853652.74613: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853651.1301575-29945-263767858701362/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 29922 1726853652.74616: _low_level_execute_command(): starting 29922 1726853652.74618: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853651.1301575-29945-263767858701362/ > /dev/null 2>&1 && sleep 0' 29922 1726853652.74620: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853652.74629: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853652.74634: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853652.74636: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 29922 1726853652.77079: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853652.77084: stdout chunk (state=3): >>><<< 29922 1726853652.77087: stderr chunk (state=3): >>><<< 29922 1726853652.77089: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 29922 1726853652.77092: handler run complete 29922 1726853652.77127: variable 'ansible_facts' from source: unknown 29922 1726853652.77344: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853652.77831: variable 'ansible_facts' from source: unknown 29922 1726853652.77922: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853652.78063: attempt loop complete, returning result 29922 1726853652.78081: _execute() done 29922 1726853652.78089: dumping result to json 29922 1726853652.78128: done dumping result, returning 29922 1726853652.78141: done running TaskExecutor() for managed_node3/TASK: Gathering Facts [02083763-bbaf-51d4-513b-0000000000af] 29922 1726853652.78150: sending task result for task 02083763-bbaf-51d4-513b-0000000000af 29922 1726853652.78892: done sending task result for task 02083763-bbaf-51d4-513b-0000000000af 29922 1726853652.78902: WORKER PROCESS EXITING ok: [managed_node3] 29922 1726853652.79024: no more pending results, returning what we have 29922 1726853652.79027: results queue empty 29922 1726853652.79027: checking for any_errors_fatal 29922 1726853652.79029: done checking for any_errors_fatal 29922 1726853652.79029: checking for max_fail_percentage 29922 1726853652.79031: done checking for max_fail_percentage 29922 1726853652.79031: checking to see if all hosts have failed and the running result is not ok 29922 1726853652.79032: done checking to see if all hosts have failed 29922 1726853652.79033: getting the remaining hosts for this loop 29922 1726853652.79035: done getting the remaining hosts for this loop 29922 1726853652.79038: getting the next task for host managed_node3 29922 1726853652.79044: done getting next task for host managed_node3 29922 1726853652.79045: ^ task is: TASK: meta (flush_handlers) 29922 1726853652.79047: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853652.79051: getting variables 29922 1726853652.79052: in VariableManager get_vars() 29922 1726853652.79075: Calling all_inventory to load vars for managed_node3 29922 1726853652.79078: Calling groups_inventory to load vars for managed_node3 29922 1726853652.79081: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853652.79089: Calling all_plugins_play to load vars for managed_node3 29922 1726853652.79092: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853652.79094: Calling groups_plugins_play to load vars for managed_node3 29922 1726853652.79265: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853652.79474: done with get_vars() 29922 1726853652.79485: done getting variables 29922 1726853652.79551: in VariableManager get_vars() 29922 1726853652.79560: Calling all_inventory to load vars for managed_node3 29922 1726853652.79563: Calling groups_inventory to load vars for managed_node3 29922 1726853652.79565: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853652.79569: Calling all_plugins_play to load vars for managed_node3 29922 1726853652.79730: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853652.79734: Calling groups_plugins_play to load vars for managed_node3 29922 1726853652.80098: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853652.80493: done with get_vars() 29922 1726853652.80508: done queuing things up, now waiting for results queue to drain 29922 1726853652.80510: results queue empty 29922 1726853652.80511: checking for any_errors_fatal 29922 1726853652.80513: done checking for any_errors_fatal 29922 1726853652.80514: checking for max_fail_percentage 29922 1726853652.80515: done checking for max_fail_percentage 29922 1726853652.80516: checking to see if all hosts have failed and the running result is not ok 29922 1726853652.80517: done checking to see if all hosts have failed 29922 1726853652.80522: getting the remaining hosts for this loop 29922 1726853652.80523: done getting the remaining hosts for this loop 29922 1726853652.80526: getting the next task for host managed_node3 29922 1726853652.80530: done getting next task for host managed_node3 29922 1726853652.80533: ^ task is: TASK: Include the task 'el_repo_setup.yml' 29922 1726853652.80535: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853652.80537: getting variables 29922 1726853652.80538: in VariableManager get_vars() 29922 1726853652.80547: Calling all_inventory to load vars for managed_node3 29922 1726853652.80549: Calling groups_inventory to load vars for managed_node3 29922 1726853652.80551: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853652.80556: Calling all_plugins_play to load vars for managed_node3 29922 1726853652.80558: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853652.80560: Calling groups_plugins_play to load vars for managed_node3 29922 1726853652.80827: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853652.81197: done with get_vars() 29922 1726853652.81204: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tests_routing_rules_nm.yml:11 Friday 20 September 2024 13:34:12 -0400 (0:00:01.730) 0:00:01.742 ****** 29922 1726853652.81284: entering _queue_task() for managed_node3/include_tasks 29922 1726853652.81286: Creating lock for include_tasks 29922 1726853652.81618: worker is 1 (out of 1 available) 29922 1726853652.81630: exiting _queue_task() for managed_node3/include_tasks 29922 1726853652.81642: done queuing things up, now waiting for results queue to drain 29922 1726853652.81643: waiting for pending results... 29922 1726853652.81876: running TaskExecutor() for managed_node3/TASK: Include the task 'el_repo_setup.yml' 29922 1726853652.81967: in run() - task 02083763-bbaf-51d4-513b-000000000006 29922 1726853652.81994: variable 'ansible_search_path' from source: unknown 29922 1726853652.82033: calling self._execute() 29922 1726853652.82115: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853652.82127: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853652.82140: variable 'omit' from source: magic vars 29922 1726853652.82251: _execute() done 29922 1726853652.82260: dumping result to json 29922 1726853652.82268: done dumping result, returning 29922 1726853652.82280: done running TaskExecutor() for managed_node3/TASK: Include the task 'el_repo_setup.yml' [02083763-bbaf-51d4-513b-000000000006] 29922 1726853652.82290: sending task result for task 02083763-bbaf-51d4-513b-000000000006 29922 1726853652.82456: no more pending results, returning what we have 29922 1726853652.82460: in VariableManager get_vars() 29922 1726853652.82490: Calling all_inventory to load vars for managed_node3 29922 1726853652.82493: Calling groups_inventory to load vars for managed_node3 29922 1726853652.82496: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853652.82506: Calling all_plugins_play to load vars for managed_node3 29922 1726853652.82510: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853652.82512: Calling groups_plugins_play to load vars for managed_node3 29922 1726853652.82900: done sending task result for task 02083763-bbaf-51d4-513b-000000000006 29922 1726853652.82903: WORKER PROCESS EXITING 29922 1726853652.82920: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853652.83105: done with get_vars() 29922 1726853652.83113: variable 'ansible_search_path' from source: unknown 29922 1726853652.83125: we have included files to process 29922 1726853652.83126: 
generating all_blocks data 29922 1726853652.83128: done generating all_blocks data 29922 1726853652.83128: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 29922 1726853652.83130: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 29922 1726853652.83132: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 29922 1726853652.83757: in VariableManager get_vars() 29922 1726853652.83772: done with get_vars() 29922 1726853652.83785: done processing included file 29922 1726853652.83787: iterating over new_blocks loaded from include file 29922 1726853652.83789: in VariableManager get_vars() 29922 1726853652.83798: done with get_vars() 29922 1726853652.83799: filtering new block on tags 29922 1726853652.83812: done filtering new block on tags 29922 1726853652.83814: in VariableManager get_vars() 29922 1726853652.83823: done with get_vars() 29922 1726853652.83824: filtering new block on tags 29922 1726853652.83839: done filtering new block on tags 29922 1726853652.83841: in VariableManager get_vars() 29922 1726853652.83849: done with get_vars() 29922 1726853652.83851: filtering new block on tags 29922 1726853652.83862: done filtering new block on tags 29922 1726853652.83864: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed_node3 29922 1726853652.83869: extending task lists for all hosts with included blocks 29922 1726853652.83915: done extending task lists 29922 1726853652.83916: done processing included files 29922 1726853652.83917: results queue empty 29922 1726853652.83918: checking for any_errors_fatal 29922 1726853652.83919: done checking for any_errors_fatal 29922 1726853652.83920: checking for max_fail_percentage 29922 1726853652.83921: done checking for max_fail_percentage 29922 1726853652.83922: checking to see if all hosts have failed and the running result is not ok 29922 1726853652.83923: done checking to see if all hosts have failed 29922 1726853652.83923: getting the remaining hosts for this loop 29922 1726853652.83924: done getting the remaining hosts for this loop 29922 1726853652.83926: getting the next task for host managed_node3 29922 1726853652.83930: done getting next task for host managed_node3 29922 1726853652.83932: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 29922 1726853652.83934: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853652.83936: getting variables 29922 1726853652.83937: in VariableManager get_vars() 29922 1726853652.83945: Calling all_inventory to load vars for managed_node3 29922 1726853652.83947: Calling groups_inventory to load vars for managed_node3 29922 1726853652.83950: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853652.83954: Calling all_plugins_play to load vars for managed_node3 29922 1726853652.83957: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853652.83959: Calling groups_plugins_play to load vars for managed_node3 29922 1726853652.84121: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853652.84318: done with get_vars() 29922 1726853652.84327: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Friday 20 September 2024 13:34:12 -0400 (0:00:00.031) 0:00:01.773 ****** 29922 1726853652.84389: entering _queue_task() for managed_node3/setup 29922 1726853652.84637: worker is 1 (out of 1 available) 29922 1726853652.84650: exiting _queue_task() for managed_node3/setup 29922 1726853652.84661: done queuing things up, now waiting for results queue to drain 29922 1726853652.84662: waiting for pending results... 29922 1726853652.84882: running TaskExecutor() for managed_node3/TASK: Gather the minimum subset of ansible_facts required by the network role test 29922 1726853652.84963: in run() - task 02083763-bbaf-51d4-513b-0000000000c0 29922 1726853652.84985: variable 'ansible_search_path' from source: unknown 29922 1726853652.84989: variable 'ansible_search_path' from source: unknown 29922 1726853652.85020: calling self._execute() 29922 1726853652.85097: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853652.85103: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853652.85110: variable 'omit' from source: magic vars 29922 1726853652.85646: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 29922 1726853652.87727: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 29922 1726853652.87796: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 29922 1726853652.87845: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 29922 1726853652.87886: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 29922 1726853652.87916: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 29922 1726853652.88004: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853652.88040: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29922 1726853652.88077: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 29922 1726853652.88123: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853652.88158: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853652.88324: variable 'ansible_facts' from source: unknown 29922 1726853652.88480: variable 'network_test_required_facts' from source: task vars 29922 1726853652.88483: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 29922 1726853652.88486: variable 'omit' from source: magic vars 29922 1726853652.88500: variable 'omit' from source: magic vars 29922 1726853652.88537: variable 'omit' from source: magic vars 29922 1726853652.88565: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29922 1726853652.88606: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29922 1726853652.88629: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29922 1726853652.88677: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853652.88680: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853652.88708: variable 'inventory_hostname' from source: host vars for 'managed_node3' 29922 1726853652.88717: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853652.88725: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853652.88876: Set connection var ansible_connection to ssh 29922 1726853652.88879: Set connection var ansible_module_compression to ZIP_DEFLATED 29922 1726853652.88881: Set connection var ansible_shell_executable to /bin/sh 29922 1726853652.88884: Set connection var ansible_pipelining to False 29922 1726853652.88886: Set connection var ansible_timeout to 10 29922 1726853652.88888: Set connection var ansible_shell_type to sh 29922 1726853652.88900: variable 'ansible_shell_executable' from source: unknown 29922 1726853652.88912: variable 'ansible_connection' from source: unknown 29922 1726853652.88921: variable 'ansible_module_compression' from source: unknown 29922 1726853652.88932: variable 'ansible_shell_type' from source: unknown 29922 1726853652.88939: variable 'ansible_shell_executable' from source: unknown 29922 1726853652.88946: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853652.88955: variable 'ansible_pipelining' from source: unknown 29922 1726853652.88962: variable 'ansible_timeout' from source: unknown 29922 1726853652.89021: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853652.89132: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 29922 1726853652.89154: variable 'omit' from source: magic vars 29922 1726853652.89165: starting attempt loop 29922 
1726853652.89174: running the handler 29922 1726853652.89193: _low_level_execute_command(): starting 29922 1726853652.89205: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 29922 1726853652.89909: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853652.89933: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853652.89950: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853652.90048: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853652.90082: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853652.90113: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853652.90132: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853652.90259: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 29922 1726853652.92668: stdout chunk (state=3): >>>/root <<< 29922 1726853652.92858: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853652.92897: stdout chunk (state=3): >>><<< 29922 1726853652.92900: stderr chunk (state=3): >>><<< 29922 1726853652.93026: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 29922 1726853652.93038: _low_level_execute_command(): starting 29922 1726853652.93041: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1726853652.9292738-30011-136506104260183 `" && echo ansible-tmp-1726853652.9292738-30011-136506104260183="` echo /root/.ansible/tmp/ansible-tmp-1726853652.9292738-30011-136506104260183 `" ) && sleep 0' 29922 1726853652.93632: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853652.93651: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853652.93667: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853652.93698: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853652.93788: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853652.93843: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853652.93863: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853652.93891: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853652.93997: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 29922 1726853652.96822: stdout chunk (state=3): >>>ansible-tmp-1726853652.9292738-30011-136506104260183=/root/.ansible/tmp/ansible-tmp-1726853652.9292738-30011-136506104260183 <<< 29922 1726853652.97055: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853652.97058: stdout chunk (state=3): >>><<< 29922 1726853652.97060: stderr chunk (state=3): >>><<< 29922 1726853652.97277: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853652.9292738-30011-136506104260183=/root/.ansible/tmp/ansible-tmp-1726853652.9292738-30011-136506104260183 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 29922 1726853652.97281: variable 'ansible_module_compression' from source: unknown 29922 1726853652.97283: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-29922avbm1zsd/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 29922 1726853652.97285: variable 'ansible_facts' from source: unknown 29922 1726853652.97445: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853652.9292738-30011-136506104260183/AnsiballZ_setup.py 29922 1726853652.97637: Sending initial data 29922 1726853652.97646: Sent initial data (154 bytes) 29922 1726853652.98230: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853652.98249: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853652.98278: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853652.98377: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853652.98399: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853652.98420: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853652.98521: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 29922 1726853653.00902: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 29922 1726853653.00937: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 29922 1726853653.01000: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 29922 1726853653.01081: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-29922avbm1zsd/tmps0eatvkc /root/.ansible/tmp/ansible-tmp-1726853652.9292738-30011-136506104260183/AnsiballZ_setup.py <<< 29922 1726853653.01085: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853652.9292738-30011-136506104260183/AnsiballZ_setup.py" <<< 29922 1726853653.01147: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-29922avbm1zsd/tmps0eatvkc" to remote "/root/.ansible/tmp/ansible-tmp-1726853652.9292738-30011-136506104260183/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853652.9292738-30011-136506104260183/AnsiballZ_setup.py" <<< 29922 1726853653.02905: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853653.02910: stderr chunk (state=3): >>><<< 29922 1726853653.02912: stdout chunk (state=3): >>><<< 29922 1726853653.02914: done transferring module to remote 29922 1726853653.02917: _low_level_execute_command(): starting 29922 1726853653.02919: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853652.9292738-30011-136506104260183/ /root/.ansible/tmp/ansible-tmp-1726853652.9292738-30011-136506104260183/AnsiballZ_setup.py && sleep 0' 29922 1726853653.03943: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853653.03946: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853653.03949: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853653.03988: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853653.03992: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 29922 1726853653.04096: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853653.04109: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853653.04125: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853653.04214: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853653.06631: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853653.06643: stdout chunk (state=3): >>><<< 29922 1726853653.06687: stderr chunk (state=3): >>><<< 29922 1726853653.06712: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853653.06727: _low_level_execute_command(): starting 29922 1726853653.06737: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853652.9292738-30011-136506104260183/AnsiballZ_setup.py && sleep 0' 29922 1726853653.07613: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853653.07635: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853653.07695: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853653.07715: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853653.07774: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853653.07795: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853653.07885: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853653.07913: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853653.07939: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853653.08086: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853653.10918: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # <<< 29922 1726853653.10989: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 29922 1726853653.11065: stdout chunk (state=3): >>>import 'posix' # <<< 29922 1726853653.11086: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 29922 1726853653.11100: stdout chunk (state=3): >>>import 'time' # <<< 29922 1726853653.11193: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook # 
/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 29922 1726853653.11206: stdout chunk (state=3): >>>import '_codecs' # <<< 29922 1726853653.11240: stdout chunk (state=3): >>>import 'codecs' # <<< 29922 1726853653.11291: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 29922 1726853653.11318: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 29922 1726853653.11331: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f20373684d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2037337b30> <<< 29922 1726853653.11362: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 29922 1726853653.11394: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f203736aa50> import '_signal' # <<< 29922 1726853653.11439: stdout chunk (state=3): >>>import '_abc' # <<< 29922 1726853653.11468: stdout chunk (state=3): >>>import 'abc' # import 'io' # <<< 29922 1726853653.11496: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 29922 1726853653.11627: stdout chunk (state=3): >>>import '_collections_abc' # <<< 29922 1726853653.11710: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # import 'os' # <<< 29922 1726853653.11713: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 29922 1726853653.11844: stdout chunk (state=3): >>>Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 29922 1726853653.11992: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f203713d130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 29922 1726853653.11996: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f203713dfa0> import 'site' # <<< 29922 1726853653.12008: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 29922 1726853653.12341: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 29922 1726853653.12383: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 29922 1726853653.12407: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 29922 1726853653.12611: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f203717bec0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f203717bf80> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 29922 1726853653.12615: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 29922 1726853653.12638: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 29922 1726853653.12688: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 29922 1726853653.12727: stdout chunk (state=3): >>>import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f20371b3830> <<< 29922 1726853653.12770: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 29922 1726853653.12788: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f20371b3ec0> import '_collections' # <<< 29922 1726853653.12846: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2037193b60> import '_functools' # <<< 29922 1726853653.12877: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f20371912b0> <<< 29922 1726853653.12978: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2037179070> <<< 29922 1726853653.13015: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 29922 1726853653.13024: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 29922 1726853653.13046: stdout chunk (state=3): >>>import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches 
/usr/lib64/python3.12/re/_parser.py <<< 29922 1726853653.13138: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 29922 1726853653.13436: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f20371d37d0> <<< 29922 1726853653.13442: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f20371d23f0> <<< 29922 1726853653.13450: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2037192150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f20371d0bc0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2037208890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f20371782f0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2037208d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2037208bf0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2037208fe0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2037176e10> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 29922 1726853653.13484: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2037209670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2037209370> import 'importlib.machinery' # <<< 29922 1726853653.13518: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 29922 1726853653.13549: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f203720a540> 
import 'importlib.util' # <<< 29922 1726853653.13586: stdout chunk (state=3): >>>import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 29922 1726853653.13612: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 29922 1726853653.13651: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2037220740> <<< 29922 1726853653.13669: stdout chunk (state=3): >>>import 'errno' # <<< 29922 1726853653.13707: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2037221e20> <<< 29922 1726853653.13733: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<< 29922 1726853653.13769: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 29922 1726853653.13832: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2037222cc0> <<< 29922 1726853653.13836: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f20372232f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2037222210> <<< 29922 1726853653.13943: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2037223d70> <<< 29922 1726853653.13947: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f20372234a0> <<< 29922 1726853653.13983: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f203720a4b0> <<< 29922 1726853653.14127: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 29922 1726853653.14138: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 29922 1726853653.14141: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 29922 1726853653.14160: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 29922 1726853653.14184: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2036f23c50> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2036f4c710> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2036f4c470> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2036f4c740> <<< 29922 1726853653.14214: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 29922 1726853653.14321: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 29922 1726853653.14463: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2036f4d070> <<< 29922 1726853653.14546: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' <<< 29922 1726853653.14704: stdout chunk (state=3): >>># extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2036f4da60> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2036f4c920> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2036f21df0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2036f4ee10> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2036f4db50> <<< 29922 1726853653.14734: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f203720ac60> <<< 29922 1726853653.14738: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 29922 1726853653.14809: 
stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 29922 1726853653.14826: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 29922 1726853653.14849: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 29922 1726853653.14882: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2036f771a0> <<< 29922 1726853653.14939: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 29922 1726853653.14962: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 29922 1726853653.15218: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 29922 1726853653.15221: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 29922 1726853653.15224: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2036f9b560> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2036ffc2c0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 29922 1726853653.15241: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 29922 1726853653.15263: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 29922 1726853653.15289: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 29922 1726853653.15379: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2036ffea20> <<< 29922 1726853653.15451: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2036ffc3e0> <<< 29922 1726853653.15488: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2036fbd2b0> <<< 29922 1726853653.15512: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py <<< 29922 1726853653.15540: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f20369293d0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2036f9a360> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2036f4fd70> <<< 29922 1726853653.15716: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 
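Everything from the PYTHONVERBOSE=1 invocation above down to this point is the remote interpreter booting: it installs its zipimport hook, loads the standard-library modules it needs (zipfile pulls in the encodings.cp437 codec for legacy ZIP filename handling), and only then starts unpacking the transferred AnsiballZ_setup.py wrapper's embedded payload zip, which the following chunks show. The sketch below demonstrates the underlying zip-payload technique in isolation: write module source into a zip, put the zip on sys.path so the zipimport hook serves it, then import and call it. It mirrors the idea only in spirit; the payload_module name and its contents are invented for illustration and are not taken from the real wrapper.

```python
#!/usr/bin/env python3
"""Minimal sketch of the zip-payload technique the import trace above is
exercising: bundle module source into a zip, let the zipimport hook serve it
from sys.path, then import and run it. Illustrative only, not the real
AnsiballZ wrapper."""
import importlib
import sys
import tempfile
import zipfile

# Hypothetical payload; a real wrapper embeds the target module plus its
# ansible.module_utils dependencies and bookkeeping.
PAYLOAD_SOURCE = "def main():\n    print('hello from the zipped payload')\n"

# Write the payload into a throwaway zip file.
with tempfile.NamedTemporaryFile(suffix=".zip", delete=False) as tmp:
    payload_zip = tmp.name
with zipfile.ZipFile(payload_zip, "w", zipfile.ZIP_DEFLATED) as zf:
    zf.writestr("payload_module.py", PAYLOAD_SOURCE)

# A zip file on sys.path is handled by the zipimport hook -- the same hook
# the trace reports installing right after interpreter start-up.
sys.path.insert(0, payload_zip)

payload = importlib.import_module("payload_module")
payload.main()  # -> hello from the zipped payload
```

Running the sketch prints the payload's message from code that was never written to disk as a loose .py file, which is the property the wrapper relies on when it ships a single self-contained AnsiballZ_setup.py to the remote tmpdir.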
29922 1726853653.15756: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f2036929670> <<< 29922 1726853653.16079: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_3y3ykb2f/ansible_setup_payload.zip' # zipimport: zlib available <<< 29922 1726853653.16194: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 29922 1726853653.16211: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 29922 1726853653.16246: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 29922 1726853653.16322: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 29922 1726853653.16353: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2036993170> <<< 29922 1726853653.16409: stdout chunk (state=3): >>>import '_typing' # <<< 29922 1726853653.16632: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2036972060> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f20369711f0> <<< 29922 1726853653.16641: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.16673: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available <<< 29922 1726853653.18086: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.19249: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2036991040> <<< 29922 1726853653.19324: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 29922 1726853653.19389: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 29922 1726853653.19403: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f20369c2ae0> <<< 29922 1726853653.19537: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f20369c2870> <<< 29922 
1726853653.19541: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f20369c2180> <<< 29922 1726853653.19551: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f20369c2bd0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2036993b90> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f20369c3860> <<< 29922 1726853653.19596: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f20369c39b0> <<< 29922 1726853653.19710: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 29922 1726853653.19744: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 29922 1726853653.19748: stdout chunk (state=3): >>>import '_locale' # <<< 29922 1726853653.19773: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f20369c3ef0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 29922 1726853653.19807: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f203682dd00> <<< 29922 1726853653.19933: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f203682f920> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 29922 1726853653.19936: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 29922 1726853653.19976: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f20368302f0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f20368311f0> <<< 29922 1726853653.20005: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 29922 1726853653.20068: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 29922 1726853653.20116: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2036833f20> <<< 29922 1726853653.20210: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f20368385c0> <<< 29922 1726853653.20228: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2036832210> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 29922 1726853653.20294: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 29922 1726853653.20306: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 29922 1726853653.20395: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 29922 1726853653.20487: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f203683bf80> <<< 29922 1726853653.20536: stdout chunk (state=3): >>>import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f203683aa50> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f203683a7b0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 29922 1726853653.20614: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f203683ad20> <<< 29922 1726853653.20799: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f20368326f0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f20368801a0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2036880350> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 29922 1726853653.20816: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2036881df0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2036881bb0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 29922 1726853653.20848: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 29922 1726853653.20904: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f20368842f0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2036882480> <<< 29922 1726853653.20921: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 29922 1726853653.20970: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 29922 1726853653.21203: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2036887ad0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f20368844a0> <<< 29922 1726853653.21238: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2036888b90> <<< 29922 1726853653.21270: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2036885010> <<< 29922 1726853653.21321: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2036888cb0> <<< 29922 1726853653.21346: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2036880560> <<< 29922 
1726853653.21375: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 29922 1726853653.21399: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 29922 1726853653.21436: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 29922 1726853653.21459: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2036710320> <<< 29922 1726853653.21601: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 29922 1726853653.21663: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f20367113d0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f203688aae0> <<< 29922 1726853653.21707: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f203688be90> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f203688a720> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # <<< 29922 1726853653.21725: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.21924: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # <<< 29922 1726853653.21956: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # <<< 29922 1726853653.21966: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.22082: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.22209: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.22756: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.23536: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f2036719700> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f203671a630> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f20367114f0> <<< 29922 1726853653.23583: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 29922 1726853653.23613: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 29922 1726853653.23639: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # # zipimport: zlib available <<< 29922 1726853653.23782: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.23939: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 29922 1726853653.24073: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f203671acc0> # zipimport: zlib available <<< 29922 1726853653.24433: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.24874: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.24940: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.25015: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 29922 1726853653.25030: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.25182: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available <<< 29922 1726853653.25185: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.25265: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 29922 1726853653.25293: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # <<< 29922 1726853653.25308: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.25346: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.25392: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 29922 1726853653.25403: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.25618: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.25846: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 29922 1726853653.25964: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # <<< 29922 1726853653.25990: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f203671b890> <<< 29922 1726853653.26008: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.26076: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.26145: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # <<< 29922 1726853653.26181: stdout chunk (state=3): >>>import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # 
<<< 29922 1726853653.26306: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.26310: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.26326: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available <<< 29922 1726853653.26377: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.26486: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.26503: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 29922 1726853653.26523: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 29922 1726853653.26798: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2036726300> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2036721190> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available <<< 29922 1726853653.26826: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.26859: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.26910: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 29922 1726853653.26926: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 29922 1726853653.26959: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 29922 1726853653.26981: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 29922 1726853653.27033: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 29922 1726853653.27048: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 29922 1726853653.27070: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 29922 1726853653.27124: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f203680ec00> <<< 29922 1726853653.27169: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f20369ee8d0> <<< 29922 1726853653.27250: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2036726480> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2036973260> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 29922 1726853653.27269: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 29922 1726853653.27293: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.27320: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 29922 1726853653.27391: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 29922 1726853653.27419: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # <<< 29922 1726853653.27431: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.27487: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.27591: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.27594: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.27597: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.27632: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.27700: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.27786: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available <<< 29922 1726853653.27998: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.28011: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available <<< 29922 1726853653.28145: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.28317: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.28387: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.28443: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py <<< 29922 1726853653.28470: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 29922 1726853653.28497: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 29922 1726853653.28558: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f20367b63c0> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 29922 1726853653.28667: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 29922 1726853653.28707: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f2036360320> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 29922 1726853653.28720: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2036360710> <<< 29922 1726853653.28782: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f203679cb60> <<< 29922 1726853653.28794: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f20367b6f30> <<< 29922 1726853653.28825: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f20367b4aa0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f20367b5490> <<< 29922 1726853653.28849: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 29922 1726853653.28908: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 29922 1726853653.28933: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 29922 1726853653.29097: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2036363680> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2036362f30> <<< 29922 1726853653.29102: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2036363110> <<< 29922 1726853653.29122: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2036362360> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 29922 1726853653.29208: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 29922 1726853653.29224: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f20363637a0> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 29922 1726853653.29256: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 29922 1726853653.29355: stdout chunk (state=3): >>># extension module 
'_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f20363ae2d0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f20363ac2f0> <<< 29922 1726853653.29358: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f20367b4650> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # <<< 29922 1726853653.29416: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.29433: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available <<< 29922 1726853653.29472: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.29637: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 29922 1726853653.29678: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available <<< 29922 1726853653.29690: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system' # # zipimport: zlib available <<< 29922 1726853653.29721: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.29761: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available <<< 29922 1726853653.29810: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.29863: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available <<< 29922 1726853653.29911: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.29967: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 29922 1726853653.29982: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.30201: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 29922 1726853653.30204: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available <<< 29922 1726853653.30695: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.31115: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 29922 1726853653.31145: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.31189: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.31228: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.31370: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.31407: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available <<< 29922 1726853653.31436: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.31492: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 29922 1726853653.31589: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.31636: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available <<< 29922 1726853653.31718: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.31819: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 29922 1726853653.31875: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f20363aff50> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 29922 1726853653.31929: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 29922 1726853653.32036: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f20363af020> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available <<< 29922 1726853653.32091: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.32153: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 29922 1726853653.32169: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.32360: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.32363: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available <<< 29922 1726853653.32414: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.32490: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available <<< 29922 1726853653.32535: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.32590: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 29922 1726853653.32636: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 29922 1726853653.32713: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 29922 1726853653.32896: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f20363ee4e0> <<< 29922 1726853653.32984: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f20363df080> import 'ansible.module_utils.facts.system.python' # <<< 29922 1726853653.33001: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.33046: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.33103: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # <<< 29922 1726853653.33119: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.33192: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.33275: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.33387: stdout chunk (state=3): >>># zipimport: zlib available <<< 
29922 1726853653.33539: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # <<< 29922 1726853653.33656: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.33679: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available <<< 29922 1726853653.33723: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py <<< 29922 1726853653.33735: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 29922 1726853653.33812: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 29922 1726853653.33816: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2036402060> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2036401ca0> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available <<< 29922 1726853653.33885: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available <<< 29922 1726853653.33888: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.33920: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available <<< 29922 1726853653.34090: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.34233: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 29922 1726853653.34255: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.34417: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.34446: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.34487: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.34537: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # <<< 29922 1726853653.34554: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.34581: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.34594: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.34732: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.34880: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 29922 1726853653.35128: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.35131: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.35134: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available <<< 29922 1726853653.35164: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.35204: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.35774: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.36290: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # <<< 29922 
1726853653.36391: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available <<< 29922 1726853653.36410: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.36519: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available <<< 29922 1726853653.36628: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.36722: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 29922 1726853653.36743: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.36885: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.37041: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 29922 1726853653.37070: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # <<< 29922 1726853653.37179: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.37198: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available <<< 29922 1726853653.37276: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.37369: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.37575: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.37777: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # <<< 29922 1726853653.37832: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 29922 1726853653.37863: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # <<< 29922 1726853653.37913: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.37933: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # <<< 29922 1726853653.37942: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.38030: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.38156: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available <<< 29922 1726853653.38201: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.38262: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available <<< 29922 1726853653.38386: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # <<< 29922 1726853653.38399: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.38648: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.38916: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available <<< 29922 1726853653.38991: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.39034: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 29922 1726853653.39050: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.39078: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.39147: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available <<< 29922 1726853653.39160: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.39253: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # <<< 29922 1726853653.39275: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available <<< 29922 1726853653.39354: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.39484: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available <<< 29922 1726853653.39487: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual' # <<< 29922 1726853653.39523: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 29922 1726853653.39618: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available <<< 29922 1726853653.39876: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.39921: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available <<< 29922 1726853653.39934: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853653.42211: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2036203950> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2036200350> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f20362003e0> <<< 29922 1726853653.43366: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": 
"34", "second": "13", "epoch": "1726853653", "epoch_int": "1726853653", "date": "2024-09-20", "time": "13:34:13", "iso8601_micro": "2024-09-20T17:34:13.415863Z", "iso8601": "2024-09-20T17:34:13Z", "iso8601_basic": "20240920T133413415863", "iso8601_basic_short": "20240920T133413", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 55604 10.31.11.217 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 55604 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-11-217.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-217", "ansible_nodename": "ip-10-31-11-217.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2a85cf21f17783c9da20681cb8e352", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCoYDjPBT+oZMoH+Iz89VMad3pzgkIKqOuUO8QZPyEpVfgGNjVOaglgkXLQyOXulclB6EA4nlBVmVP6IyWL+N4gskjf5Qmm5n+WHu3amXXm9l66v+yKWruV18sgIn8o6iAdCgZrJFzdFVNeKRLYV6P67syyegqRkOJe7U2m/rxA967Vm6wcrwJN8eiZc8tbx1lHOuJNCcP20ThNxMHIPfvT2im8rlt/ojf6e+Z1axRnvFBubBg1qDfxHEX6AlxMHl+CIOXxGHxsvSxfLq7lhrXrComBSxTDv+fmHeJkck3VGck2rn8Hz3eBTty453RU3pq6KxdqU1GB+Z+VYHInXEtXb2i38QpLqRfiLqwUxdjkFKqz0j5z/+NskQF3w8j6uz77Revs9G8eMs14sICJtnD9qBLhFuGxlR/ovZCjynVjBTKBfDaVASpjV0iCIZKSgLn6zSaM/6FBhnBoeb4ch2iiw2S8Q0aKJNKIp0AfY21IVplyS3VCu+Br3fIXzoINo0s=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIX3PaAHBZzPq23oQJkywX/2bB39yz9ZrSLgsNfhL04NHwnY0Up/oiN+aiUte1DWFqV5wiDLJpl9a1tDLARWXNA=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIBb2CS4KVp6KVVqnGA45j7tkSkijXfGxbd3neKpDsjh9", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_apparmor": {"status": "disabled"}, "ansible_distribution": 
"CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_service_mgr": "systemd", "ansible_local": {}, "ansi<<< 29922 1726853653.43538: stdout chunk (state=3): >>>ble_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_lsb": {}, "ansible_fips": false, "ansible_pkg_mgr": "dnf", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 29922 1726853653.43924: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 29922 1726853653.43959: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread <<< 29922 1726853653.43987: stdout chunk (state=3): >>># cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site<<< 29922 1726853653.44078: stdout chunk (state=3): >>> # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # 
cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect <<< 29922 1726853653.44163: stdout chunk (state=3): >>># cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing 
ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing 
multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base <<< 29922 1726853653.44464: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # 
cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy 
ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna <<< 29922 1726853653.44656: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 29922 1726853653.44709: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path <<< 29922 1726853653.44712: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress <<< 29922 1726853653.44779: stdout chunk (state=3): >>># destroy ntpath <<< 29922 1726853653.44786: stdout chunk (state=3): >>># destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner <<< 29922 1726853653.44900: stdout chunk (state=3): >>># destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging <<< 29922 1726853653.44953: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal <<< 29922 1726853653.44988: stdout chunk (state=3): >>># destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue <<< 29922 1726853653.45141: stdout chunk (state=3): >>># destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing <<< 29922 1726853653.45155: stdout chunk (state=3): >>># destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy 
ansible.module_utils.facts.collector <<< 29922 1726853653.45208: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes <<< 29922 1726853653.45243: stdout chunk (state=3): >>># cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc <<< 29922 1726853653.45287: stdout chunk (state=3): >>># cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser <<< 29922 1726853653.45479: stdout chunk (state=3): >>># cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins <<< 29922 1726853653.45483: stdout chunk (state=3): >>># destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 29922 1726853653.45611: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize <<< 29922 1726853653.45636: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 29922 1726853653.45705: stdout chunk (state=3): >>># destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy 
ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves <<< 29922 1726853653.45727: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 29922 1726853653.45952: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time <<< 29922 1726853653.45955: stdout chunk (state=3): >>># destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 29922 1726853653.46410: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853653.46420: stderr chunk (state=3): >>>Shared connection to 10.31.11.217 closed. <<< 29922 1726853653.46515: stderr chunk (state=3): >>><<< 29922 1726853653.46817: stdout chunk (state=3): >>><<< 29922 1726853653.46843: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f20373684d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2037337b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f203736aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f203713d130> # 
/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f203713dfa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f203717bec0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f203717bf80> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f20371b3830> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f20371b3ec0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2037193b60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f20371912b0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2037179070> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f20371d37d0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f20371d23f0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2037192150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f20371d0bc0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2037208890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f20371782f0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2037208d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2037208bf0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2037208fe0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2037176e10> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2037209670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2037209370> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f203720a540> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2037220740> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from 
'/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2037221e20> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2037222cc0> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f20372232f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2037222210> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2037223d70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f20372234a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f203720a4b0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2036f23c50> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2036f4c710> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2036f4c470> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2036f4c740> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from 
'/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2036f4d070> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2036f4da60> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2036f4c920> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2036f21df0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2036f4ee10> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2036f4db50> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f203720ac60> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2036f771a0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2036f9b560> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2036ffc2c0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2036ffea20> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2036ffc3e0> import 'pathlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f2036fbd2b0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f20369293d0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2036f9a360> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2036f4fd70> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f2036929670> # zipimport: found 103 names in '/tmp/ansible_setup_payload_3y3ykb2f/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2036993170> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2036972060> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f20369711f0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2036991040> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f20369c2ae0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f20369c2870> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f20369c2180> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 
'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f20369c2bd0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2036993b90> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f20369c3860> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f20369c39b0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f20369c3ef0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f203682dd00> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f203682f920> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f20368302f0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f20368311f0> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2036833f20> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f20368385c0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2036832210> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f203683bf80> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f203683aa50> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f203683a7b0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f203683ad20> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f20368326f0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f20368801a0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2036880350> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2036881df0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2036881bb0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f20368842f0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2036882480> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f2036887ad0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f20368844a0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2036888b90> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2036885010> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2036888cb0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2036880560> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2036710320> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f20367113d0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f203688aae0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f203688be90> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f203688a720> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 
'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2036719700> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f203671a630> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f20367114f0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f203671acc0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f203671b890> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2036726300> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f2036721190> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f203680ec00> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f20369ee8d0> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2036726480> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2036973260> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f20367b63c0> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2036360320> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2036360710> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f203679cb60> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f20367b6f30> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f20367b4aa0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f20367b5490> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2036363680> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2036362f30> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2036363110> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2036362360> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f20363637a0> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f20363ae2d0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f20363ac2f0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f20367b4650> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f20363aff50> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f20363af020> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f20363ee4e0> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f20363df080> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2036402060> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2036401ca0> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2036203950> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2036200350> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f20362003e0> {"ansible_facts": {"ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "34", "second": "13", "epoch": "1726853653", "epoch_int": "1726853653", "date": "2024-09-20", "time": "13:34:13", "iso8601_micro": "2024-09-20T17:34:13.415863Z", "iso8601": "2024-09-20T17:34:13Z", "iso8601_basic": "20240920T133413415863", "iso8601_basic_short": "20240920T133413", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", 
"GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 55604 10.31.11.217 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 55604 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-11-217.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-217", "ansible_nodename": "ip-10-31-11-217.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2a85cf21f17783c9da20681cb8e352", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCoYDjPBT+oZMoH+Iz89VMad3pzgkIKqOuUO8QZPyEpVfgGNjVOaglgkXLQyOXulclB6EA4nlBVmVP6IyWL+N4gskjf5Qmm5n+WHu3amXXm9l66v+yKWruV18sgIn8o6iAdCgZrJFzdFVNeKRLYV6P67syyegqRkOJe7U2m/rxA967Vm6wcrwJN8eiZc8tbx1lHOuJNCcP20ThNxMHIPfvT2im8rlt/ojf6e+Z1axRnvFBubBg1qDfxHEX6AlxMHl+CIOXxGHxsvSxfLq7lhrXrComBSxTDv+fmHeJkck3VGck2rn8Hz3eBTty453RU3pq6KxdqU1GB+Z+VYHInXEtXb2i38QpLqRfiLqwUxdjkFKqz0j5z/+NskQF3w8j6uz77Revs9G8eMs14sICJtnD9qBLhFuGxlR/ovZCjynVjBTKBfDaVASpjV0iCIZKSgLn6zSaM/6FBhnBoeb4ch2iiw2S8Q0aKJNKIp0AfY21IVplyS3VCu+Br3fIXzoINo0s=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIX3PaAHBZzPq23oQJkywX/2bB39yz9ZrSLgsNfhL04NHwnY0Up/oiN+aiUte1DWFqV5wiDLJpl9a1tDLARWXNA=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIBb2CS4KVp6KVVqnGA45j7tkSkijXfGxbd3neKpDsjh9", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_apparmor": {"status": "disabled"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_service_mgr": "systemd", "ansible_local": {}, "ansible_cmdline": {"BOOT_IMAGE": 
"(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_lsb": {}, "ansible_fips": false, "ansible_pkg_mgr": "dnf", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing 
errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text 
# cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # 
cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing 
ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy 
ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy 
re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. [WARNING]: Module invocation had junk after the JSON data:
29922 1726853653.49215: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853652.9292738-30011-136506104260183/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 29922 1726853653.49220: _low_level_execute_command(): starting 29922 1726853653.49222: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853652.9292738-30011-136506104260183/ > /dev/null 2>&1 && sleep 0' 29922 1726853653.49225: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1:
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853653.49227: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853653.49229: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853653.49232: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853653.49234: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853653.51324: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853653.51328: stderr chunk (state=3): >>><<< 29922 1726853653.51330: stdout chunk (state=3): >>><<< 29922 1726853653.51332: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853653.51334: handler run complete 29922 1726853653.51336: variable 'ansible_facts' from source: unknown 29922 1726853653.51678: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853653.51914: variable 'ansible_facts' from source: unknown 29922 1726853653.51976: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853653.52152: attempt loop complete, returning result 29922 1726853653.52283: _execute() done 29922 1726853653.52318: dumping result to json 29922 1726853653.52477: done dumping result, returning 29922 1726853653.52480: done running TaskExecutor() for managed_node3/TASK: Gather the minimum subset of ansible_facts required by the network role test [02083763-bbaf-51d4-513b-0000000000c0] 29922 1726853653.52483: sending task result for task 02083763-bbaf-51d4-513b-0000000000c0 29922 1726853653.52876: done sending task result for task 02083763-bbaf-51d4-513b-0000000000c0 29922 1726853653.52880: WORKER PROCESS 
EXITING ok: [managed_node3] 29922 1726853653.52991: no more pending results, returning what we have 29922 1726853653.52994: results queue empty 29922 1726853653.52995: checking for any_errors_fatal 29922 1726853653.52996: done checking for any_errors_fatal 29922 1726853653.52997: checking for max_fail_percentage 29922 1726853653.52998: done checking for max_fail_percentage 29922 1726853653.52999: checking to see if all hosts have failed and the running result is not ok 29922 1726853653.53000: done checking to see if all hosts have failed 29922 1726853653.53000: getting the remaining hosts for this loop 29922 1726853653.53002: done getting the remaining hosts for this loop 29922 1726853653.53006: getting the next task for host managed_node3 29922 1726853653.53014: done getting next task for host managed_node3 29922 1726853653.53017: ^ task is: TASK: Check if system is ostree 29922 1726853653.53020: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29922 1726853653.53023: getting variables 29922 1726853653.53025: in VariableManager get_vars() 29922 1726853653.53053: Calling all_inventory to load vars for managed_node3 29922 1726853653.53056: Calling groups_inventory to load vars for managed_node3 29922 1726853653.53061: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853653.53477: Calling all_plugins_play to load vars for managed_node3 29922 1726853653.53482: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853653.53487: Calling groups_plugins_play to load vars for managed_node3 29922 1726853653.54251: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853653.54979: done with get_vars() 29922 1726853653.54993: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Friday 20 September 2024 13:34:13 -0400 (0:00:00.707) 0:00:02.480 ****** 29922 1726853653.55098: entering _queue_task() for managed_node3/stat 29922 1726853653.56111: worker is 1 (out of 1 available) 29922 1726853653.56123: exiting _queue_task() for managed_node3/stat 29922 1726853653.56134: done queuing things up, now waiting for results queue to drain 29922 1726853653.56136: waiting for pending results... 
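(Editor's note, not part of the captured log: the lines that follow trace Ansible's standard remote-execution sequence for the queued stat task — reuse of the multiplexed SSH connection, `echo ~` to resolve the remote home, creation of a per-task directory under ~/.ansible/tmp with umask 77, SFTP transfer of the AnsiballZ_stat.py payload, chmod u+x, execution with the remote /usr/bin/python3.12, and finally removal of the temp directory. The sketch below is an editorial illustration of that sequence under stated assumptions; run_ssh, the combined root@10.31.11.217 target, the local payload filename, and the simplified temp-dir name are hypothetical placeholders, while the ControlPath socket and command shapes mirror what the log itself shows.)

    import subprocess, time

    HOST = "root@10.31.11.217"                 # hypothetical: user and address as seen separately in the log
    CONTROL = "/root/.ansible/cp/bee039678b"   # ControlMaster socket reported by the ssh debug output

    def run_ssh(cmd):
        # Reuse the existing ControlMaster connection, as the auto-mux lines in the log indicate.
        return subprocess.run(
            ["ssh", "-o", f"ControlPath={CONTROL}", HOST, cmd],
            capture_output=True, text=True, check=True,
        ).stdout

    home = run_ssh("echo ~ && sleep 0").strip()                       # log shows stdout=/root
    tmp = f"{home}/.ansible/tmp/ansible-tmp-{time.time()}-example"    # simplified; real names add pid and a random suffix
    run_ssh(f'( umask 77 && mkdir -p "{home}/.ansible/tmp" && mkdir "{tmp}" ) && sleep 0')
    # Transfer the locally built module payload over SFTP on the same multiplexed connection.
    subprocess.run(["sftp", "-o", f"ControlPath={CONTROL}", HOST],
                   input=f"put AnsiballZ_stat.py {tmp}/AnsiballZ_stat.py\n", text=True, check=True)
    run_ssh(f"chmod u+x {tmp}/ {tmp}/AnsiballZ_stat.py && sleep 0")
    print(run_ssh(f"/usr/bin/python3.12 {tmp}/AnsiballZ_stat.py && sleep 0"))
    run_ssh(f"rm -f -r {tmp}/ > /dev/null 2>&1 && sleep 0")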
29922 1726853653.56421: running TaskExecutor() for managed_node3/TASK: Check if system is ostree 29922 1726853653.56780: in run() - task 02083763-bbaf-51d4-513b-0000000000c2 29922 1726853653.56784: variable 'ansible_search_path' from source: unknown 29922 1726853653.56786: variable 'ansible_search_path' from source: unknown 29922 1726853653.56789: calling self._execute() 29922 1726853653.56792: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853653.56794: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853653.56797: variable 'omit' from source: magic vars 29922 1726853653.57767: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 29922 1726853653.58320: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 29922 1726853653.58577: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 29922 1726853653.58581: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 29922 1726853653.58583: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 29922 1726853653.58762: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 29922 1726853653.58892: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 29922 1726853653.58925: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853653.58953: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 29922 1726853653.59251: Evaluated conditional (not __network_is_ostree is defined): True 29922 1726853653.59261: variable 'omit' from source: magic vars 29922 1726853653.59297: variable 'omit' from source: magic vars 29922 1726853653.59332: variable 'omit' from source: magic vars 29922 1726853653.59478: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29922 1726853653.59581: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29922 1726853653.59601: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29922 1726853653.59648: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853653.59661: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853653.59804: variable 'inventory_hostname' from source: host vars for 'managed_node3' 29922 1726853653.59807: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853653.59812: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853653.60020: Set connection var ansible_connection to ssh 29922 1726853653.60177: Set connection var ansible_module_compression to ZIP_DEFLATED 29922 1726853653.60181: 
Set connection var ansible_shell_executable to /bin/sh 29922 1726853653.60184: Set connection var ansible_pipelining to False 29922 1726853653.60186: Set connection var ansible_timeout to 10 29922 1726853653.60188: Set connection var ansible_shell_type to sh 29922 1726853653.60190: variable 'ansible_shell_executable' from source: unknown 29922 1726853653.60191: variable 'ansible_connection' from source: unknown 29922 1726853653.60194: variable 'ansible_module_compression' from source: unknown 29922 1726853653.60196: variable 'ansible_shell_type' from source: unknown 29922 1726853653.60203: variable 'ansible_shell_executable' from source: unknown 29922 1726853653.60207: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853653.60209: variable 'ansible_pipelining' from source: unknown 29922 1726853653.60251: variable 'ansible_timeout' from source: unknown 29922 1726853653.60259: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853653.60574: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 29922 1726853653.60678: variable 'omit' from source: magic vars 29922 1726853653.60681: starting attempt loop 29922 1726853653.60682: running the handler 29922 1726853653.60686: _low_level_execute_command(): starting 29922 1726853653.60699: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 29922 1726853653.62263: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853653.62269: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853653.62275: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853653.62408: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853653.62467: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853653.62472: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853653.62642: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853653.65010: stdout chunk (state=3): >>>/root <<< 29922 1726853653.65131: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853653.65335: stderr chunk (state=3): >>><<< 29922 1726853653.65339: stdout chunk (state=3): >>><<< 29922 1726853653.65344: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853653.65357: _low_level_execute_command(): starting 29922 1726853653.65529: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853653.6531527-30037-12760472044349 `" && echo ansible-tmp-1726853653.6531527-30037-12760472044349="` echo /root/.ansible/tmp/ansible-tmp-1726853653.6531527-30037-12760472044349 `" ) && sleep 0' 29922 1726853653.66607: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853653.66611: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853653.66614: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853653.66616: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853653.66863: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853653.66967: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853653.67053: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853653.69997: stdout chunk (state=3): >>>ansible-tmp-1726853653.6531527-30037-12760472044349=/root/.ansible/tmp/ansible-tmp-1726853653.6531527-30037-12760472044349 <<< 29922 1726853653.70002: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853653.70004: stdout chunk (state=3): >>><<< 29922 1726853653.70006: stderr chunk (state=3): >>><<< 29922 1726853653.70008: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726853653.6531527-30037-12760472044349=/root/.ansible/tmp/ansible-tmp-1726853653.6531527-30037-12760472044349 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853653.70278: variable 'ansible_module_compression' from source: unknown 29922 1726853653.70282: ANSIBALLZ: Using lock for stat 29922 1726853653.70284: ANSIBALLZ: Acquiring lock 29922 1726853653.70286: ANSIBALLZ: Lock acquired: 140376041362048 29922 1726853653.70288: ANSIBALLZ: Creating module 29922 1726853653.99755: ANSIBALLZ: Writing module into payload 29922 1726853653.99759: ANSIBALLZ: Writing module 29922 1726853653.99889: ANSIBALLZ: Renaming module 29922 1726853653.99900: ANSIBALLZ: Done creating module 29922 1726853653.99920: variable 'ansible_facts' from source: unknown 29922 1726853654.00110: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853653.6531527-30037-12760472044349/AnsiballZ_stat.py 29922 1726853654.00422: Sending initial data 29922 1726853654.00485: Sent initial data (152 bytes) 29922 1726853654.01597: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853654.01614: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853654.01701: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853654.01951: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853654.01954: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853654.01957: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 
1726853654.02065: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853654.03750: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 29922 1726853654.03782: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 29922 1726853654.03863: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 29922 1726853654.03963: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-29922avbm1zsd/tmp4ky75peu /root/.ansible/tmp/ansible-tmp-1726853653.6531527-30037-12760472044349/AnsiballZ_stat.py <<< 29922 1726853654.03974: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853653.6531527-30037-12760472044349/AnsiballZ_stat.py" <<< 29922 1726853654.04035: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-29922avbm1zsd/tmp4ky75peu" to remote "/root/.ansible/tmp/ansible-tmp-1726853653.6531527-30037-12760472044349/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853653.6531527-30037-12760472044349/AnsiballZ_stat.py" <<< 29922 1726853654.05345: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853654.05348: stdout chunk (state=3): >>><<< 29922 1726853654.05350: stderr chunk (state=3): >>><<< 29922 1726853654.05490: done transferring module to remote 29922 1726853654.05497: _low_level_execute_command(): starting 29922 1726853654.05500: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853653.6531527-30037-12760472044349/ /root/.ansible/tmp/ansible-tmp-1726853653.6531527-30037-12760472044349/AnsiballZ_stat.py && sleep 0' 29922 1726853654.07042: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853654.07058: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853654.07156: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853654.07216: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853654.07380: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853654.09238: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853654.09269: stderr chunk (state=3): >>><<< 29922 1726853654.09435: stdout chunk (state=3): >>><<< 29922 1726853654.09439: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853654.09443: _low_level_execute_command(): starting 29922 1726853654.09445: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853653.6531527-30037-12760472044349/AnsiballZ_stat.py && sleep 0' 29922 1726853654.10176: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853654.10180: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853654.10183: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853654.10185: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853654.10187: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 29922 1726853654.10189: stderr chunk (state=3): >>>debug2: match not found <<< 29922 1726853654.10191: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853654.10193: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 29922 1726853654.10195: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 
1726853654.10208: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853654.10211: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853654.10351: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853654.12598: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 29922 1726853654.12625: stdout chunk (state=3): >>>import _imp # builtin import '_thread' # <<< 29922 1726853654.12651: stdout chunk (state=3): >>>import '_warnings' # import '_weakref' # <<< 29922 1726853654.12704: stdout chunk (state=3): >>>import '_io' # <<< 29922 1726853654.12733: stdout chunk (state=3): >>>import 'marshal' # <<< 29922 1726853654.12754: stdout chunk (state=3): >>>import 'posix' # <<< 29922 1726853654.12790: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 29922 1726853654.12818: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook <<< 29922 1726853654.12882: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 29922 1726853654.12906: stdout chunk (state=3): >>>import '_codecs' # <<< 29922 1726853654.12918: stdout chunk (state=3): >>>import 'codecs' # <<< 29922 1726853654.12944: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 29922 1726853654.12976: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e65c184d0> <<< 29922 1726853654.13018: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e65be7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 29922 1726853654.13077: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e65c1aa50> import '_signal' # import '_abc' # <<< 29922 1726853654.13185: stdout chunk (state=3): >>>import 'abc' # import 'io' # import '_stat' # import 'stat' # <<< 29922 1726853654.13213: stdout chunk (state=3): >>>import '_collections_abc' # <<< 29922 1726853654.13316: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 29922 1726853654.13319: stdout chunk (state=3): >>>import 'os' # <<< 29922 1726853654.13379: stdout chunk (state=3): >>>import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e659c9130> <<< 29922 1726853654.13512: stdout chunk (state=3): >>># 
/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e659c9fa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 29922 1726853654.13732: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 29922 1726853654.13977: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 29922 1726853654.13981: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 29922 1726853654.13987: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 29922 1726853654.13990: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 29922 1726853654.13993: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 29922 1726853654.13995: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e65a07e90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e65a07f50> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 29922 1726853654.14015: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 29922 1726853654.14132: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e65a3f890> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 29922 1726853654.14151: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e65a3ff20> import '_collections' # <<< 29922 1726853654.14220: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e65a1fb60> <<< 29922 1726853654.14231: stdout chunk (state=3): >>>import '_functools' # <<< 29922 1726853654.14255: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e65a1d280> 
<<< 29922 1726853654.14344: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e65a05040> <<< 29922 1726853654.14451: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 29922 1726853654.14479: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # <<< 29922 1726853654.14493: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 29922 1726853654.14514: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 29922 1726853654.14542: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e65a5f800> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e65a5e420> <<< 29922 1726853654.14779: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e65a1e150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e65a5cc80> <<< 29922 1726853654.14822: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e65a94890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e65a042c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8e65a94d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e65a94bf0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8e65a94fe0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e65a02de0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e65a956d0> 
import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e65a953a0> import 'importlib.machinery' # <<< 29922 1726853654.14865: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 29922 1726853654.14903: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e65a965d0> import 'importlib.util' # import 'runpy' # <<< 29922 1726853654.14923: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 29922 1726853654.14960: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 29922 1726853654.15017: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e65aac7a0> <<< 29922 1726853654.15025: stdout chunk (state=3): >>>import 'errno' # <<< 29922 1726853654.15059: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8e65aadeb0> <<< 29922 1726853654.15088: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 29922 1726853654.15108: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e65aaed50> <<< 29922 1726853654.15163: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8e65aaf380> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e65aae2a0> <<< 29922 1726853654.15189: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 29922 1726853654.15198: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 29922 1726853654.15242: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 29922 1726853654.15256: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8e65aafe00> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e65aaf530> <<< 29922 1726853654.15383: stdout chunk (state=3): >>>import 'shutil' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f8e65a96570> <<< 29922 1726853654.15387: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 29922 1726853654.15412: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 29922 1726853654.15466: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8e65833ce0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 29922 1726853654.15544: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8e6585c830> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e6585c590> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8e6585c770> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py <<< 29922 1726853654.15569: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 29922 1726853654.15639: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 29922 1726853654.16012: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8e6585d100> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8e6585daf0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e6585c9b0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e65831e80> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 29922 1726853654.16016: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 29922 1726853654.16022: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 29922 1726853654.16040: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 29922 1726853654.16053: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e6585eea0> <<< 29922 1726853654.16089: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e6585dc10> <<< 29922 1726853654.16105: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e65a96cc0> <<< 29922 1726853654.16136: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 29922 1726853654.16241: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 29922 1726853654.16316: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e65887230> <<< 29922 1726853654.16385: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 29922 1726853654.16405: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 29922 1726853654.16424: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 29922 1726853654.16508: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e658ab590> <<< 29922 1726853654.16527: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 29922 1726853654.16588: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 29922 1726853654.16669: stdout chunk (state=3): >>>import 'ntpath' # <<< 29922 1726853654.16700: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e6590c2f0> <<< 29922 1726853654.16757: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 29922 1726853654.16781: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 29922 1726853654.16837: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 29922 1726853654.16965: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e6590ea50> <<< 29922 1726853654.17114: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e6590c410> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e658d13a0> <<< 29922 1726853654.17178: 
stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e657113d0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e658aa3c0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e6585fe00> <<< 29922 1726853654.17337: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 29922 1726853654.17381: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f8e65711670> <<< 29922 1726853654.17617: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_ntytxjpd/ansible_stat_payload.zip' <<< 29922 1726853654.17628: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853654.17863: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 29922 1726853654.17876: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 29922 1726853654.17932: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 29922 1726853654.18066: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py <<< 29922 1726853654.18081: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e657670b0> <<< 29922 1726853654.18110: stdout chunk (state=3): >>>import '_typing' # <<< 29922 1726853654.18356: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e65745fa0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e65745160> <<< 29922 1726853654.18392: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible' # <<< 29922 1726853654.18415: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853654.18424: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853654.18477: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available <<< 29922 1726853654.20295: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853654.22084: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e65764f80> <<< 29922 1726853654.22570: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from 
'/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc'<<< 29922 1726853654.22579: stdout chunk (state=3): >>> # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8e6578eae0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e6578e870> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e6578e180> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e6578e5d0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e65767d40> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8e6578f800> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8e6578fa40> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # <<< 29922 1726853654.22662: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e6578ff80> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 29922 1726853654.22665: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 29922 1726853654.22706: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e65115d00> <<< 29922 1726853654.22750: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8e65117920> <<< 29922 1726853654.22768: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 29922 1726853654.22793: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 29922 1726853654.22859: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e65118290> <<< 29922 1726853654.22863: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches 
/usr/lib64/python3.12/shlex.py <<< 29922 1726853654.22883: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 29922 1726853654.22913: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e65119430> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 29922 1726853654.22951: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 29922 1726853654.22979: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 29922 1726853654.23031: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e6511be60> <<< 29922 1726853654.23076: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8e65a02ed0> <<< 29922 1726853654.23095: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e6511a120> <<< 29922 1726853654.23105: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 29922 1726853654.23143: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 29922 1726853654.23175: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 29922 1726853654.23192: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 29922 1726853654.23219: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 29922 1726853654.23245: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py <<< 29922 1726853654.23250: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 29922 1726853654.23270: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e65123d10> <<< 29922 1726853654.23294: stdout chunk (state=3): >>>import '_tokenize' # <<< 29922 1726853654.23342: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e651227e0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e65122540> <<< 29922 1726853654.23353: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<< 29922 1726853654.23387: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 29922 1726853654.23457: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e65122ab0> <<< 29922 1726853654.23500: stdout chunk 
(state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e6511a630> <<< 29922 1726853654.23504: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8e6516bfb0> <<< 29922 1726853654.23535: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e6516c0b0> <<< 29922 1726853654.23567: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 29922 1726853654.23585: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 29922 1726853654.23605: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 29922 1726853654.23650: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' <<< 29922 1726853654.23670: stdout chunk (state=3): >>># extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8e6516db80> <<< 29922 1726853654.23680: stdout chunk (state=3): >>>import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e6516d940> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 29922 1726853654.24206: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8e6516ffe0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e6516e1b0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e65173830> <<< 29922 1726853654.24302: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e65170200> <<< 29922 1726853654.24376: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' <<< 29922 
1726853654.24379: stdout chunk (state=3): >>># extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8e651745c0> <<< 29922 1726853654.24415: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8e65174a40> <<< 29922 1726853654.24473: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' <<< 29922 1726853654.24483: stdout chunk (state=3): >>># extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8e651749e0> <<< 29922 1726853654.24500: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e6516c200> <<< 29922 1726853654.24524: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py <<< 29922 1726853654.24531: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 29922 1726853654.24548: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 29922 1726853654.24585: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 29922 1726853654.24612: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 29922 1726853654.24649: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8e65000170> <<< 29922 1726853654.25451: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8e65001280> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e65176900> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8e65177cb0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e65176540> # zipimport: zlib available<<< 29922 1726853654.25456: stdout chunk (state=3): >>> # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib 
available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 29922 1726853654.25547: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853654.25740: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853654.26627: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853654.27596: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 29922 1726853654.27637: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8e65005580> <<< 29922 1726853654.27751: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py <<< 29922 1726853654.27760: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 29922 1726853654.27777: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e65006990> <<< 29922 1726853654.27790: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e650013a0> <<< 29922 1726853654.28083: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available <<< 29922 1726853654.28124: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853654.28362: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 29922 1726853654.28375: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 29922 1726853654.28389: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e650061e0> <<< 29922 1726853654.28395: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853654.29149: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853654.29886: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853654.29996: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853654.30103: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 29922 1726853654.30110: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853654.30168: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853654.30223: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 29922 1726853654.30230: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853654.30328: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853654.30451: 
stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 29922 1726853654.30466: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853654.30479: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # <<< 29922 1726853654.30503: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853654.30559: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853654.30611: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 29922 1726853654.30614: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853654.30987: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853654.31352: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 29922 1726853654.31448: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 29922 1726853654.31451: stdout chunk (state=3): >>>import '_ast' # <<< 29922 1726853654.31558: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e65007560> <<< 29922 1726853654.31683: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 29922 1726853654.31783: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # <<< 29922 1726853654.31790: stdout chunk (state=3): >>>import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # <<< 29922 1726853654.31828: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853654.31882: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853654.31939: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 29922 1726853654.31942: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853654.32004: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853654.32062: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853654.32141: stdout chunk (state=3): >>># zipimport: zlib available <<< 29922 1726853654.32236: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 29922 1726853654.32282: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 29922 1726853654.32677: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8e65012150> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e6500f410> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available<<< 29922 1726853654.32684: stdout chunk (state=3): >>> <<< 29922 1726853654.32747: stdout chunk (state=3): >>># zipimport: zlib available<<< 29922 1726853654.32750: stdout chunk (state=3): >>> <<< 29922 1726853654.32816: stdout chunk (state=3): >>># 
/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py <<< 29922 1726853654.32827: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 29922 1726853654.32880: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 29922 1726853654.32905: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc'<<< 29922 1726853654.32947: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 29922 1726853654.33078: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 29922 1726853654.33207: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 29922 1726853654.33217: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e65102990> <<< 29922 1726853654.33288: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e657c6660><<< 29922 1726853654.33299: stdout chunk (state=3): >>> <<< 29922 1726853654.33445: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e65011f10><<< 29922 1726853654.33449: stdout chunk (state=3): >>> <<< 29922 1726853654.33520: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e658871a0> # destroy ansible.module_utils.distro <<< 29922 1726853654.33524: stdout chunk (state=3): >>>import 'ansible.module_utils.distro' # # zipimport: zlib available <<< 29922 1726853654.33599: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 29922 1726853654.33699: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # # zipimport: zlib available<<< 29922 1726853654.33704: stdout chunk (state=3): >>> <<< 29922 1726853654.33733: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.modules' # <<< 29922 1726853654.33755: stdout chunk (state=3): >>> # zipimport: zlib available<<< 29922 1726853654.33890: stdout chunk (state=3): >>> <<< 29922 1726853654.33999: stdout chunk (state=3): >>># zipimport: zlib available<<< 29922 1726853654.34002: stdout chunk (state=3): >>> <<< 29922 1726853654.34320: stdout chunk (state=3): >>># zipimport: zlib available<<< 29922 1726853654.34323: stdout chunk (state=3): >>> <<< 29922 1726853654.34467: stdout chunk (state=3): >>> <<< 29922 1726853654.34489: stdout chunk (state=3): >>>{"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 29922 1726853654.34529: stdout chunk (state=3): >>># destroy __main__<<< 29922 1726853654.34533: stdout chunk (state=3): >>> <<< 29922 1726853654.35043: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 29922 1726853654.35064: stdout chunk (state=3): >>># clear sys.path_hooks <<< 29922 1726853654.35085: 
stdout chunk (state=3): >>># clear builtins._<<< 29922 1726853654.35090: stdout chunk (state=3): >>> # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc<<< 29922 1726853654.35113: stdout chunk (state=3): >>> # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout<<< 29922 1726853654.35131: stdout chunk (state=3): >>> # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal<<< 29922 1726853654.35163: stdout chunk (state=3): >>> # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat<<< 29922 1726853654.35195: stdout chunk (state=3): >>> # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword<<< 29922 1726853654.35237: stdout chunk (state=3): >>> # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings <<< 29922 1726853654.35262: stdout chunk (state=3): >>># cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random<<< 29922 1726853654.35295: stdout chunk (state=3): >>> # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # 
cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib<<< 29922 1726853654.35314: stdout chunk (state=3): >>> # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible<<< 29922 1726853654.35344: stdout chunk (state=3): >>> # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl<<< 29922 1726853654.35365: stdout chunk (state=3): >>> # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache<<< 29922 1726853654.35391: stdout chunk (state=3): >>> # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string<<< 29922 1726853654.35412: stdout chunk (state=3): >>> # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon<<< 29922 1726853654.35438: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters<<< 29922 1726853654.35462: stdout chunk (state=3): >>> # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # 
cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors<<< 29922 1726853654.35486: stdout chunk (state=3): >>> # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters<<< 29922 1726853654.35505: stdout chunk (state=3): >>> # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux<<< 29922 1726853654.35549: stdout chunk (state=3): >>> # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils <<< 29922 1726853654.35602: stdout chunk (state=3): >>># destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 29922 1726853654.36104: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath <<< 29922 1726853654.36116: stdout chunk (state=3): >>># destroy importlib <<< 29922 1726853654.36141: stdout chunk (state=3): >>># destroy zipimport # destroy __main__<<< 29922 1726853654.36168: stdout chunk (state=3): >>> # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder<<< 29922 1726853654.36211: stdout chunk (state=3): >>> # destroy json.encoder # destroy json.scanner # destroy _json<<< 29922 1726853654.36241: stdout chunk (state=3): >>> # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale<<< 29922 1726853654.36250: stdout chunk (state=3): >>> # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess <<< 29922 1726853654.36329: stdout chunk (state=3): >>># destroy syslog # destroy uuid # destroy selectors <<< 29922 1726853654.36335: stdout chunk (state=3): >>># destroy 
errno # destroy array<<< 29922 1726853654.36366: stdout chunk (state=3): >>> # destroy datetime # destroy selinux # destroy shutil<<< 29922 1726853654.36398: stdout chunk (state=3): >>> # destroy distro<<< 29922 1726853654.36409: stdout chunk (state=3): >>> # destroy distro.distro<<< 29922 1726853654.36477: stdout chunk (state=3): >>> # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux <<< 29922 1726853654.36530: stdout chunk (state=3): >>># cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes <<< 29922 1726853654.36562: stdout chunk (state=3): >>># cleanup[3] wiping ansible.module_utils.six.moves.collections_abc <<< 29922 1726853654.36565: stdout chunk (state=3): >>># cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128<<< 29922 1726853654.36607: stdout chunk (state=3): >>> # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize<<< 29922 1726853654.36611: stdout chunk (state=3): >>> # cleanup[3] wiping platform # cleanup[3] wiping atexit<<< 29922 1726853654.36667: stdout chunk (state=3): >>> # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib<<< 29922 1726853654.36692: stdout chunk (state=3): >>> # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix<<< 29922 1726853654.36777: stdout chunk (state=3): >>> # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser <<< 29922 1726853654.36781: stdout chunk (state=3): >>># cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc<<< 29922 1726853654.36785: stdout chunk (state=3): >>> # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig<<< 29922 1726853654.36807: stdout chunk (state=3): >>> # cleanup[3] wiping os # destroy posixpath<<< 29922 1726853654.36858: stdout chunk (state=3): >>> # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc<<< 29922 1726853654.36862: stdout chunk (state=3): >>> # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref<<< 29922 1726853654.36864: stdout chunk (state=3): >>> # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins<<< 29922 1726853654.36866: stdout chunk 
(state=3): >>> # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader<<< 29922 1726853654.37084: stdout chunk (state=3): >>> # destroy systemd._journal # destroy _datetime # destroy sys.monitoring <<< 29922 1726853654.37092: stdout chunk (state=3): >>># destroy _socket<<< 29922 1726853654.37116: stdout chunk (state=3): >>> # destroy _collections <<< 29922 1726853654.37156: stdout chunk (state=3): >>># destroy platform<<< 29922 1726853654.37188: stdout chunk (state=3): >>> # destroy _uuid <<< 29922 1726853654.37226: stdout chunk (state=3): >>># destroy stat # destroy genericpath # destroy re._parser <<< 29922 1726853654.37261: stdout chunk (state=3): >>># destroy tokenize <<< 29922 1726853654.37298: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib<<< 29922 1726853654.37317: stdout chunk (state=3): >>> <<< 29922 1726853654.37322: stdout chunk (state=3): >>># destroy copyreg # destroy contextlib <<< 29922 1726853654.37361: stdout chunk (state=3): >>># destroy _typing <<< 29922 1726853654.37408: stdout chunk (state=3): >>># destroy _tokenize <<< 29922 1726853654.37411: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib_parse <<< 29922 1726853654.37490: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 29922 1726853654.37566: stdout chunk (state=3): >>># destroy codecs <<< 29922 1726853654.37618: stdout chunk (state=3): >>># destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig <<< 29922 1726853654.37648: stdout chunk (state=3): >>># destroy encodings.cp437 # destroy _codecs <<< 29922 1726853654.37658: stdout chunk (state=3): >>># destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit<<< 29922 1726853654.37690: stdout chunk (state=3): >>> # destroy _warnings # destroy math # destroy _bisect # destroy time<<< 29922 1726853654.37725: stdout chunk (state=3): >>> # destroy _random<<< 29922 1726853654.37739: stdout chunk (state=3): >>> # destroy _weakref<<< 29922 1726853654.37770: stdout chunk (state=3): >>> # destroy _hashlib<<< 29922 1726853654.37802: stdout chunk (state=3): >>> # destroy _operator # destroy _string<<< 29922 1726853654.37823: stdout chunk (state=3): >>> # destroy re # destroy itertools <<< 29922 1726853654.37857: stdout chunk (state=3): >>># destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread<<< 29922 1726853654.37881: stdout chunk (state=3): >>> # clear sys.audit hooks<<< 29922 1726853654.37984: stdout chunk (state=3): >>> <<< 29922 1726853654.38411: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
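(Editor's note, not part of the captured log.) The chunked stdout above is the remote `stat` module run: the surrounding `# import ...` / `# destroy ...` lines are CPython's verbose import and shutdown trace, and the single JSON line in the middle is the actual module result (`"changed": false`, `"stat": {"exists": false}` for `/run/ostree-booted`, i.e. the host is not an rpm-ostree system). Below is a minimal illustrative sketch, in plain Python, of how that one result line can be pulled out of such a noisy capture and interpreted; it is not Ansible's own parsing code, just an assumed helper for reading logs like this one.

```python
# Sketch only: extract the module's JSON result line from captured stdout that
# is otherwise full of CPython's verbose import/shutdown trace.
import json


def extract_module_result(stdout: str) -> dict:
    """Return the first line that parses as a JSON object, or {} if none does."""
    for line in stdout.splitlines():
        line = line.strip()
        if line.startswith("{"):
            try:
                return json.loads(line)
            except json.JSONDecodeError:
                continue  # not the result line, keep scanning
    return {}


# Example using the result shown in the log above:
captured = (
    '{"changed": false, "stat": {"exists": false}, '
    '"invocation": {"module_args": {"path": "/run/ostree-booted", '
    '"follow": false, "get_checksum": true, "get_mime": true, '
    '"get_attributes": true, "checksum_algorithm": "sha1"}}}'
)
result = extract_module_result(captured)
print(result["stat"]["exists"])  # False -> /run/ostree-booted is absent on this host
```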
<<< 29922 1726853654.38414: stdout chunk (state=3): >>><<< 29922 1726853654.38416: stderr chunk (state=3): >>><<< 29922 1726853654.38589: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e65c184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e65be7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e65c1aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e659c9130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e659c9fa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e65a07e90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e65a07f50> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e65a3f890> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e65a3ff20> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e65a1fb60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e65a1d280> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e65a05040> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e65a5f800> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e65a5e420> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e65a1e150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e65a5cc80> # 
/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e65a94890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e65a042c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8e65a94d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e65a94bf0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8e65a94fe0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e65a02de0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e65a956d0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e65a953a0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e65a965d0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e65aac7a0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8e65aadeb0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f8e65aaed50> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8e65aaf380> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e65aae2a0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8e65aafe00> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e65aaf530> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e65a96570> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8e65833ce0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8e6585c830> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e6585c590> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8e6585c770> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8e6585d100> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8e6585daf0> import 'hashlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f8e6585c9b0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e65831e80> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e6585eea0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e6585dc10> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e65a96cc0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e65887230> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e658ab590> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e6590c2f0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e6590ea50> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e6590c410> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e658d13a0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e657113d0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e658aa3c0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e6585fe00> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # 
<_frozen_importlib_external.SourcelessFileLoader object at 0x7f8e65711670> # zipimport: found 30 names in '/tmp/ansible_stat_payload_ntytxjpd/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e657670b0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e65745fa0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e65745160> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e65764f80> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8e6578eae0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e6578e870> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e6578e180> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e6578e5d0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e65767d40> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8e6578f800> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from 
'/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8e6578fa40> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e6578ff80> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e65115d00> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8e65117920> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e65118290> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e65119430> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e6511be60> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8e65a02ed0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e6511a120> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e65123d10> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e651227e0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e65122540> # 
/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e65122ab0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e6511a630> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8e6516bfb0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e6516c0b0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8e6516db80> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e6516d940> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8e6516ffe0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e6516e1b0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e65173830> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e65170200> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8e651745c0> # extension module 'systemd._reader' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8e65174a40> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8e651749e0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e6516c200> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8e65000170> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8e65001280> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e65176900> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8e65177cb0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e65176540> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f8e65005580> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e65006990> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e650013a0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e650061e0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e65007560> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8e65012150> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e6500f410> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches 
/usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e65102990> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e657c6660> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e65011f10> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8e658871a0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing 
re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy 
ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # 
destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # 
destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
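The output above is the module payload for the "Check if system is ostree" task executing on the target over the reused SSH ControlMaster connection; its embedded JSON result reports that /run/ostree-booted does not exist. Reconstructed from the module arguments, registered-variable names, and conditionals visible in this log (not from the verbatim contents of el_repo_setup.yml), the task and the fact-setting task traced further below likely resemble this sketch:

- name: Check if system is ostree
  stat:
    path: /run/ostree-booted            # matches the module_args shown in the result above
  register: __ostree_booted_stat        # registered result referenced later in this log

- name: Set flag to indicate system is ostree
  set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"   # evaluates to false on this host
  when: not __network_is_ostree is defined   # conditional evaluated in the trace below

Only the set_fact conditional evaluation is visible in this excerpt; the stat task may carry the same guard.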
[WARNING]: Module invocation had junk after the JSON data: # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] 
removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing 
ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # 
cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks 29922 1726853654.39606: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853653.6531527-30037-12760472044349/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 29922 1726853654.39609: _low_level_execute_command(): starting 29922 1726853654.39612: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r 
/root/.ansible/tmp/ansible-tmp-1726853653.6531527-30037-12760472044349/ > /dev/null 2>&1 && sleep 0' 29922 1726853654.39875: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853654.39892: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853654.39908: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853654.39928: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853654.39953: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 29922 1726853654.39967: stderr chunk (state=3): >>>debug2: match not found <<< 29922 1726853654.40073: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853654.40109: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853654.40219: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853654.42941: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853654.42953: stdout chunk (state=3): >>><<< 29922 1726853654.42972: stderr chunk (state=3): >>><<< 29922 1726853654.43090: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853654.43094: handler run complete 29922 1726853654.43096: attempt loop complete, returning result 29922 1726853654.43100: _execute() done 29922 1726853654.43103: dumping result to json 29922 1726853654.43105: done dumping result, returning 29922 1726853654.43107: done running TaskExecutor() for managed_node3/TASK: Check if system is ostree 
[02083763-bbaf-51d4-513b-0000000000c2] 29922 1726853654.43109: sending task result for task 02083763-bbaf-51d4-513b-0000000000c2 ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 29922 1726853654.43279: no more pending results, returning what we have 29922 1726853654.43282: results queue empty 29922 1726853654.43283: checking for any_errors_fatal 29922 1726853654.43291: done checking for any_errors_fatal 29922 1726853654.43292: checking for max_fail_percentage 29922 1726853654.43293: done checking for max_fail_percentage 29922 1726853654.43294: checking to see if all hosts have failed and the running result is not ok 29922 1726853654.43295: done checking to see if all hosts have failed 29922 1726853654.43296: getting the remaining hosts for this loop 29922 1726853654.43297: done getting the remaining hosts for this loop 29922 1726853654.43302: getting the next task for host managed_node3 29922 1726853654.43308: done getting next task for host managed_node3 29922 1726853654.43311: ^ task is: TASK: Set flag to indicate system is ostree 29922 1726853654.43314: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29922 1726853654.43318: getting variables 29922 1726853654.43319: in VariableManager get_vars() 29922 1726853654.43353: Calling all_inventory to load vars for managed_node3 29922 1726853654.43355: Calling groups_inventory to load vars for managed_node3 29922 1726853654.43362: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853654.43614: Calling all_plugins_play to load vars for managed_node3 29922 1726853654.43619: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853654.43624: Calling groups_plugins_play to load vars for managed_node3 29922 1726853654.44031: done sending task result for task 02083763-bbaf-51d4-513b-0000000000c2 29922 1726853654.44035: WORKER PROCESS EXITING 29922 1726853654.44062: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853654.44362: done with get_vars() 29922 1726853654.44375: done getting variables 29922 1726853654.44479: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Friday 20 September 2024 13:34:14 -0400 (0:00:00.894) 0:00:03.374 ****** 29922 1726853654.44509: entering _queue_task() for managed_node3/set_fact 29922 1726853654.44511: Creating lock for set_fact 29922 1726853654.44861: worker is 1 (out of 1 available) 29922 1726853654.44877: exiting _queue_task() for managed_node3/set_fact 29922 1726853654.44898: done queuing things up, now waiting for 
results queue to drain 29922 1726853654.44900: waiting for pending results... 29922 1726853654.45191: running TaskExecutor() for managed_node3/TASK: Set flag to indicate system is ostree 29922 1726853654.45269: in run() - task 02083763-bbaf-51d4-513b-0000000000c3 29922 1726853654.45282: variable 'ansible_search_path' from source: unknown 29922 1726853654.45286: variable 'ansible_search_path' from source: unknown 29922 1726853654.45319: calling self._execute() 29922 1726853654.45387: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853654.45392: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853654.45402: variable 'omit' from source: magic vars 29922 1726853654.46011: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 29922 1726853654.46294: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 29922 1726853654.46334: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 29922 1726853654.46366: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 29922 1726853654.46409: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 29922 1726853654.46491: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 29922 1726853654.46514: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 29922 1726853654.46538: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853654.46563: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 29922 1726853654.46680: Evaluated conditional (not __network_is_ostree is defined): True 29922 1726853654.46686: variable 'omit' from source: magic vars 29922 1726853654.46747: variable 'omit' from source: magic vars 29922 1726853654.46900: variable '__ostree_booted_stat' from source: set_fact 29922 1726853654.46915: variable 'omit' from source: magic vars 29922 1726853654.46934: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29922 1726853654.46969: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29922 1726853654.47002: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29922 1726853654.47021: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853654.47031: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853654.47054: variable 'inventory_hostname' from source: host vars for 'managed_node3' 29922 1726853654.47060: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853654.47068: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 
1726853654.47142: Set connection var ansible_connection to ssh 29922 1726853654.47148: Set connection var ansible_module_compression to ZIP_DEFLATED 29922 1726853654.47155: Set connection var ansible_shell_executable to /bin/sh 29922 1726853654.47163: Set connection var ansible_pipelining to False 29922 1726853654.47168: Set connection var ansible_timeout to 10 29922 1726853654.47172: Set connection var ansible_shell_type to sh 29922 1726853654.47190: variable 'ansible_shell_executable' from source: unknown 29922 1726853654.47193: variable 'ansible_connection' from source: unknown 29922 1726853654.47195: variable 'ansible_module_compression' from source: unknown 29922 1726853654.47199: variable 'ansible_shell_type' from source: unknown 29922 1726853654.47202: variable 'ansible_shell_executable' from source: unknown 29922 1726853654.47204: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853654.47212: variable 'ansible_pipelining' from source: unknown 29922 1726853654.47214: variable 'ansible_timeout' from source: unknown 29922 1726853654.47222: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853654.47288: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 29922 1726853654.47296: variable 'omit' from source: magic vars 29922 1726853654.47301: starting attempt loop 29922 1726853654.47304: running the handler 29922 1726853654.47316: handler run complete 29922 1726853654.47327: attempt loop complete, returning result 29922 1726853654.47330: _execute() done 29922 1726853654.47333: dumping result to json 29922 1726853654.47335: done dumping result, returning 29922 1726853654.47340: done running TaskExecutor() for managed_node3/TASK: Set flag to indicate system is ostree [02083763-bbaf-51d4-513b-0000000000c3] 29922 1726853654.47345: sending task result for task 02083763-bbaf-51d4-513b-0000000000c3 29922 1726853654.47419: done sending task result for task 02083763-bbaf-51d4-513b-0000000000c3 29922 1726853654.47428: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 29922 1726853654.47481: no more pending results, returning what we have 29922 1726853654.47484: results queue empty 29922 1726853654.47485: checking for any_errors_fatal 29922 1726853654.47492: done checking for any_errors_fatal 29922 1726853654.47492: checking for max_fail_percentage 29922 1726853654.47494: done checking for max_fail_percentage 29922 1726853654.47495: checking to see if all hosts have failed and the running result is not ok 29922 1726853654.47496: done checking to see if all hosts have failed 29922 1726853654.47496: getting the remaining hosts for this loop 29922 1726853654.47498: done getting the remaining hosts for this loop 29922 1726853654.47501: getting the next task for host managed_node3 29922 1726853654.47509: done getting next task for host managed_node3 29922 1726853654.47511: ^ task is: TASK: Fix CentOS6 Base repo 29922 1726853654.47514: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29922 1726853654.47518: getting variables 29922 1726853654.47519: in VariableManager get_vars() 29922 1726853654.47546: Calling all_inventory to load vars for managed_node3 29922 1726853654.47549: Calling groups_inventory to load vars for managed_node3 29922 1726853654.47552: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853654.47563: Calling all_plugins_play to load vars for managed_node3 29922 1726853654.47566: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853654.47576: Calling groups_plugins_play to load vars for managed_node3 29922 1726853654.47743: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853654.47864: done with get_vars() 29922 1726853654.47872: done getting variables 29922 1726853654.47969: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Friday 20 September 2024 13:34:14 -0400 (0:00:00.034) 0:00:03.409 ****** 29922 1726853654.48000: entering _queue_task() for managed_node3/copy 29922 1726853654.48266: worker is 1 (out of 1 available) 29922 1726853654.48281: exiting _queue_task() for managed_node3/copy 29922 1726853654.48294: done queuing things up, now waiting for results queue to drain 29922 1726853654.48296: waiting for pending results... 
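The "Set flag to indicate system is ostree" result above comes from a set_fact task guarded by the conditional the executor reports, not __network_is_ostree is defined, and fed by the earlier __ostree_booted_stat stat result. A minimal sketch of such a task, reconstructed only from what this log shows (the exact Jinja expression in the role may differ):

- name: Set flag to indicate system is ostree
  set_fact:
    # assumes the usual stat-result shape; only the variable names appear in this log
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
  when: not __network_is_ostree is defined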
29922 1726853654.48505: running TaskExecutor() for managed_node3/TASK: Fix CentOS6 Base repo 29922 1726853654.48556: in run() - task 02083763-bbaf-51d4-513b-0000000000c5 29922 1726853654.48569: variable 'ansible_search_path' from source: unknown 29922 1726853654.48574: variable 'ansible_search_path' from source: unknown 29922 1726853654.48607: calling self._execute() 29922 1726853654.48682: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853654.48687: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853654.48698: variable 'omit' from source: magic vars 29922 1726853654.49201: variable 'ansible_distribution' from source: facts 29922 1726853654.49213: Evaluated conditional (ansible_distribution == 'CentOS'): True 29922 1726853654.49326: variable 'ansible_distribution_major_version' from source: facts 29922 1726853654.49332: Evaluated conditional (ansible_distribution_major_version == '6'): False 29922 1726853654.49334: when evaluation is False, skipping this task 29922 1726853654.49337: _execute() done 29922 1726853654.49339: dumping result to json 29922 1726853654.49342: done dumping result, returning 29922 1726853654.49351: done running TaskExecutor() for managed_node3/TASK: Fix CentOS6 Base repo [02083763-bbaf-51d4-513b-0000000000c5] 29922 1726853654.49355: sending task result for task 02083763-bbaf-51d4-513b-0000000000c5 29922 1726853654.49632: done sending task result for task 02083763-bbaf-51d4-513b-0000000000c5 29922 1726853654.49635: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 29922 1726853654.49698: no more pending results, returning what we have 29922 1726853654.49701: results queue empty 29922 1726853654.49702: checking for any_errors_fatal 29922 1726853654.49705: done checking for any_errors_fatal 29922 1726853654.49706: checking for max_fail_percentage 29922 1726853654.49707: done checking for max_fail_percentage 29922 1726853654.49708: checking to see if all hosts have failed and the running result is not ok 29922 1726853654.49709: done checking to see if all hosts have failed 29922 1726853654.49709: getting the remaining hosts for this loop 29922 1726853654.49711: done getting the remaining hosts for this loop 29922 1726853654.49714: getting the next task for host managed_node3 29922 1726853654.49718: done getting next task for host managed_node3 29922 1726853654.49721: ^ task is: TASK: Include the task 'enable_epel.yml' 29922 1726853654.49723: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853654.49727: getting variables 29922 1726853654.49728: in VariableManager get_vars() 29922 1726853654.49781: Calling all_inventory to load vars for managed_node3 29922 1726853654.49784: Calling groups_inventory to load vars for managed_node3 29922 1726853654.49787: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853654.49796: Calling all_plugins_play to load vars for managed_node3 29922 1726853654.49799: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853654.49802: Calling groups_plugins_play to load vars for managed_node3 29922 1726853654.50199: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853654.50604: done with get_vars() 29922 1726853654.50613: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Friday 20 September 2024 13:34:14 -0400 (0:00:00.026) 0:00:03.436 ****** 29922 1726853654.50701: entering _queue_task() for managed_node3/include_tasks 29922 1726853654.51082: worker is 1 (out of 1 available) 29922 1726853654.51093: exiting _queue_task() for managed_node3/include_tasks 29922 1726853654.51103: done queuing things up, now waiting for results queue to drain 29922 1726853654.51104: waiting for pending results... 29922 1726853654.51294: running TaskExecutor() for managed_node3/TASK: Include the task 'enable_epel.yml' 29922 1726853654.51373: in run() - task 02083763-bbaf-51d4-513b-0000000000c6 29922 1726853654.51439: variable 'ansible_search_path' from source: unknown 29922 1726853654.51442: variable 'ansible_search_path' from source: unknown 29922 1726853654.51445: calling self._execute() 29922 1726853654.51534: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853654.51549: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853654.51565: variable 'omit' from source: magic vars 29922 1726853654.52100: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 29922 1726853654.55504: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 29922 1726853654.55635: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 29922 1726853654.55639: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 29922 1726853654.55692: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 29922 1726853654.55724: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 29922 1726853654.55819: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853654.55868: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29922 1726853654.55964: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, 
class_only=False) 29922 1726853654.55968: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853654.55973: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853654.56099: variable '__network_is_ostree' from source: set_fact 29922 1726853654.56123: Evaluated conditional (not __network_is_ostree | d(false)): True 29922 1726853654.56135: _execute() done 29922 1726853654.56142: dumping result to json 29922 1726853654.56149: done dumping result, returning 29922 1726853654.56163: done running TaskExecutor() for managed_node3/TASK: Include the task 'enable_epel.yml' [02083763-bbaf-51d4-513b-0000000000c6] 29922 1726853654.56183: sending task result for task 02083763-bbaf-51d4-513b-0000000000c6 29922 1726853654.56411: no more pending results, returning what we have 29922 1726853654.56417: in VariableManager get_vars() 29922 1726853654.56452: Calling all_inventory to load vars for managed_node3 29922 1726853654.56455: Calling groups_inventory to load vars for managed_node3 29922 1726853654.56461: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853654.56475: Calling all_plugins_play to load vars for managed_node3 29922 1726853654.56481: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853654.56485: Calling groups_plugins_play to load vars for managed_node3 29922 1726853654.57180: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853654.57599: done with get_vars() 29922 1726853654.57616: variable 'ansible_search_path' from source: unknown 29922 1726853654.57617: variable 'ansible_search_path' from source: unknown 29922 1726853654.57750: we have included files to process 29922 1726853654.57751: generating all_blocks data 29922 1726853654.57753: done generating all_blocks data 29922 1726853654.57761: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 29922 1726853654.57762: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 29922 1726853654.57765: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 29922 1726853654.57777: done sending task result for task 02083763-bbaf-51d4-513b-0000000000c6 29922 1726853654.57780: WORKER PROCESS EXITING 29922 1726853654.59112: done processing included file 29922 1726853654.59114: iterating over new_blocks loaded from include file 29922 1726853654.59115: in VariableManager get_vars() 29922 1726853654.59126: done with get_vars() 29922 1726853654.59128: filtering new block on tags 29922 1726853654.59269: done filtering new block on tags 29922 1726853654.59275: in VariableManager get_vars() 29922 1726853654.59286: done with get_vars() 29922 1726853654.59287: filtering new block on tags 29922 1726853654.59299: done filtering new block on tags 29922 1726853654.59301: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed_node3 29922 1726853654.59306: extending task lists for all hosts with included blocks 
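Both el_repo_setup.yml tasks traced above behave exactly as their when clauses dictate: the CentOS 6 repo fix is skipped because only the first of its two conditions holds on this host, while the include runs because the ostree flag set earlier is false. A hedged sketch built only from the module names and conditionals shown in the log; the copy arguments are placeholders, since they are never rendered here:

- name: Fix CentOS6 Base repo
  copy:
    dest: /etc/yum.repos.d/CentOS-Base.repo        # placeholder path, not visible in this log
    content: "{{ centos6_base_repo_content }}"     # placeholder variable, not visible in this log
  when:
    - ansible_distribution == 'CentOS'
    - ansible_distribution_major_version == '6'

- name: Include the task 'enable_epel.yml'
  include_tasks: enable_epel.yml   # resolved to tests/network/tasks/enable_epel.yml per the log
  when: not __network_is_ostree | d(false)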
29922 1726853654.59526: done extending task lists 29922 1726853654.59527: done processing included files 29922 1726853654.59528: results queue empty 29922 1726853654.59529: checking for any_errors_fatal 29922 1726853654.59532: done checking for any_errors_fatal 29922 1726853654.59533: checking for max_fail_percentage 29922 1726853654.59534: done checking for max_fail_percentage 29922 1726853654.59535: checking to see if all hosts have failed and the running result is not ok 29922 1726853654.59535: done checking to see if all hosts have failed 29922 1726853654.59536: getting the remaining hosts for this loop 29922 1726853654.59537: done getting the remaining hosts for this loop 29922 1726853654.59539: getting the next task for host managed_node3 29922 1726853654.59543: done getting next task for host managed_node3 29922 1726853654.59546: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 29922 1726853654.59548: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853654.59551: getting variables 29922 1726853654.59552: in VariableManager get_vars() 29922 1726853654.59562: Calling all_inventory to load vars for managed_node3 29922 1726853654.59565: Calling groups_inventory to load vars for managed_node3 29922 1726853654.59677: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853654.59688: Calling all_plugins_play to load vars for managed_node3 29922 1726853654.59696: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853654.59700: Calling groups_plugins_play to load vars for managed_node3 29922 1726853654.60351: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853654.60580: done with get_vars() 29922 1726853654.60589: done getting variables 29922 1726853654.60669: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) 29922 1726853654.60887: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 10] ********************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Friday 20 September 2024 13:34:14 -0400 (0:00:00.102) 0:00:03.538 ****** 29922 1726853654.60934: entering _queue_task() for managed_node3/command 29922 1726853654.60936: Creating lock for command 29922 1726853654.61506: worker is 1 (out of 1 available) 29922 1726853654.61519: exiting _queue_task() for managed_node3/command 29922 1726853654.61533: done queuing things up, now waiting for results queue to drain 29922 1726853654.61534: waiting for pending results... 
29922 1726853654.61834: running TaskExecutor() for managed_node3/TASK: Create EPEL 10 29922 1726853654.61962: in run() - task 02083763-bbaf-51d4-513b-0000000000e0 29922 1726853654.61988: variable 'ansible_search_path' from source: unknown 29922 1726853654.62003: variable 'ansible_search_path' from source: unknown 29922 1726853654.62040: calling self._execute() 29922 1726853654.62128: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853654.62143: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853654.62176: variable 'omit' from source: magic vars 29922 1726853654.63196: variable 'ansible_distribution' from source: facts 29922 1726853654.63200: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 29922 1726853654.63410: variable 'ansible_distribution_major_version' from source: facts 29922 1726853654.63430: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 29922 1726853654.63651: when evaluation is False, skipping this task 29922 1726853654.63655: _execute() done 29922 1726853654.63661: dumping result to json 29922 1726853654.63664: done dumping result, returning 29922 1726853654.63666: done running TaskExecutor() for managed_node3/TASK: Create EPEL 10 [02083763-bbaf-51d4-513b-0000000000e0] 29922 1726853654.63669: sending task result for task 02083763-bbaf-51d4-513b-0000000000e0 skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 29922 1726853654.63816: no more pending results, returning what we have 29922 1726853654.63820: results queue empty 29922 1726853654.63820: checking for any_errors_fatal 29922 1726853654.63822: done checking for any_errors_fatal 29922 1726853654.63822: checking for max_fail_percentage 29922 1726853654.63824: done checking for max_fail_percentage 29922 1726853654.63825: checking to see if all hosts have failed and the running result is not ok 29922 1726853654.63826: done checking to see if all hosts have failed 29922 1726853654.63826: getting the remaining hosts for this loop 29922 1726853654.63828: done getting the remaining hosts for this loop 29922 1726853654.63831: getting the next task for host managed_node3 29922 1726853654.63838: done getting next task for host managed_node3 29922 1726853654.63841: ^ task is: TASK: Install yum-utils package 29922 1726853654.63845: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853654.63849: getting variables 29922 1726853654.63851: in VariableManager get_vars() 29922 1726853654.63994: Calling all_inventory to load vars for managed_node3 29922 1726853654.63997: Calling groups_inventory to load vars for managed_node3 29922 1726853654.64001: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853654.64016: Calling all_plugins_play to load vars for managed_node3 29922 1726853654.64020: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853654.64024: Calling groups_plugins_play to load vars for managed_node3 29922 1726853654.64805: done sending task result for task 02083763-bbaf-51d4-513b-0000000000e0 29922 1726853654.64809: WORKER PROCESS EXITING 29922 1726853654.64845: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853654.65134: done with get_vars() 29922 1726853654.65152: done getting variables 29922 1726853654.65270: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Friday 20 September 2024 13:34:14 -0400 (0:00:00.043) 0:00:03.582 ****** 29922 1726853654.65302: entering _queue_task() for managed_node3/package 29922 1726853654.65304: Creating lock for package 29922 1726853654.65722: worker is 1 (out of 1 available) 29922 1726853654.65735: exiting _queue_task() for managed_node3/package 29922 1726853654.65745: done queuing things up, now waiting for results queue to drain 29922 1726853654.65747: waiting for pending results... 
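The header rendered above as "Create EPEL 10" is a templated task name: the executor resolves ansible_distribution_major_version from facts before printing it, then skips the task because 10 is not in ['7', '8']. A sketch of that pattern; the command itself is a placeholder because the handler never runs in this log:

- name: Create EPEL {{ ansible_distribution_major_version }}
  command: "{{ create_epel_command }}"   # placeholder; the real command is never reached, the task is skipped
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version in ['7', '8']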
29922 1726853654.65917: running TaskExecutor() for managed_node3/TASK: Install yum-utils package 29922 1726853654.66048: in run() - task 02083763-bbaf-51d4-513b-0000000000e1 29922 1726853654.66075: variable 'ansible_search_path' from source: unknown 29922 1726853654.66088: variable 'ansible_search_path' from source: unknown 29922 1726853654.66125: calling self._execute() 29922 1726853654.66220: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853654.66235: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853654.66260: variable 'omit' from source: magic vars 29922 1726853654.66696: variable 'ansible_distribution' from source: facts 29922 1726853654.66714: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 29922 1726853654.66862: variable 'ansible_distribution_major_version' from source: facts 29922 1726853654.66903: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 29922 1726853654.66906: when evaluation is False, skipping this task 29922 1726853654.66908: _execute() done 29922 1726853654.66911: dumping result to json 29922 1726853654.66912: done dumping result, returning 29922 1726853654.66915: done running TaskExecutor() for managed_node3/TASK: Install yum-utils package [02083763-bbaf-51d4-513b-0000000000e1] 29922 1726853654.66917: sending task result for task 02083763-bbaf-51d4-513b-0000000000e1 skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 29922 1726853654.67212: no more pending results, returning what we have 29922 1726853654.67215: results queue empty 29922 1726853654.67216: checking for any_errors_fatal 29922 1726853654.67229: done checking for any_errors_fatal 29922 1726853654.67230: checking for max_fail_percentage 29922 1726853654.67232: done checking for max_fail_percentage 29922 1726853654.67232: checking to see if all hosts have failed and the running result is not ok 29922 1726853654.67233: done checking to see if all hosts have failed 29922 1726853654.67234: getting the remaining hosts for this loop 29922 1726853654.67236: done getting the remaining hosts for this loop 29922 1726853654.67239: getting the next task for host managed_node3 29922 1726853654.67247: done getting next task for host managed_node3 29922 1726853654.67250: ^ task is: TASK: Enable EPEL 7 29922 1726853654.67255: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853654.67262: getting variables 29922 1726853654.67263: in VariableManager get_vars() 29922 1726853654.67368: Calling all_inventory to load vars for managed_node3 29922 1726853654.67449: Calling groups_inventory to load vars for managed_node3 29922 1726853654.67453: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853654.67468: Calling all_plugins_play to load vars for managed_node3 29922 1726853654.67473: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853654.67477: Calling groups_plugins_play to load vars for managed_node3 29922 1726853654.67724: done sending task result for task 02083763-bbaf-51d4-513b-0000000000e1 29922 1726853654.67727: WORKER PROCESS EXITING 29922 1726853654.67752: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853654.67996: done with get_vars() 29922 1726853654.68007: done getting variables 29922 1726853654.68080: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Friday 20 September 2024 13:34:14 -0400 (0:00:00.028) 0:00:03.610 ****** 29922 1726853654.68112: entering _queue_task() for managed_node3/command 29922 1726853654.68438: worker is 1 (out of 1 available) 29922 1726853654.68449: exiting _queue_task() for managed_node3/command 29922 1726853654.68463: done queuing things up, now waiting for results queue to drain 29922 1726853654.68464: waiting for pending results... 
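The yum-utils step applies the same distribution and version guard through the generic package action plugin. A minimal sketch consistent with the task name, module, and conditionals the log reports (the state argument is an assumption; it is not visible here):

- name: Install yum-utils package
  package:
    name: yum-utils
    state: present   # assumed; only the module and package name can be inferred from this log
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version in ['7', '8']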
29922 1726853654.68699: running TaskExecutor() for managed_node3/TASK: Enable EPEL 7 29922 1726853654.68870: in run() - task 02083763-bbaf-51d4-513b-0000000000e2 29922 1726853654.68877: variable 'ansible_search_path' from source: unknown 29922 1726853654.68880: variable 'ansible_search_path' from source: unknown 29922 1726853654.68902: calling self._execute() 29922 1726853654.68997: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853654.69009: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853654.69040: variable 'omit' from source: magic vars 29922 1726853654.69433: variable 'ansible_distribution' from source: facts 29922 1726853654.69478: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 29922 1726853654.69744: variable 'ansible_distribution_major_version' from source: facts 29922 1726853654.69747: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 29922 1726853654.69749: when evaluation is False, skipping this task 29922 1726853654.69752: _execute() done 29922 1726853654.69754: dumping result to json 29922 1726853654.69756: done dumping result, returning 29922 1726853654.69762: done running TaskExecutor() for managed_node3/TASK: Enable EPEL 7 [02083763-bbaf-51d4-513b-0000000000e2] 29922 1726853654.69775: sending task result for task 02083763-bbaf-51d4-513b-0000000000e2 skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 29922 1726853654.69948: no more pending results, returning what we have 29922 1726853654.69959: results queue empty 29922 1726853654.69960: checking for any_errors_fatal 29922 1726853654.69968: done checking for any_errors_fatal 29922 1726853654.69968: checking for max_fail_percentage 29922 1726853654.69970: done checking for max_fail_percentage 29922 1726853654.69972: checking to see if all hosts have failed and the running result is not ok 29922 1726853654.69973: done checking to see if all hosts have failed 29922 1726853654.69974: getting the remaining hosts for this loop 29922 1726853654.69975: done getting the remaining hosts for this loop 29922 1726853654.69979: getting the next task for host managed_node3 29922 1726853654.69985: done getting next task for host managed_node3 29922 1726853654.69988: ^ task is: TASK: Enable EPEL 8 29922 1726853654.69992: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853654.69997: getting variables 29922 1726853654.69998: in VariableManager get_vars() 29922 1726853654.70026: Calling all_inventory to load vars for managed_node3 29922 1726853654.70028: Calling groups_inventory to load vars for managed_node3 29922 1726853654.70032: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853654.70046: Calling all_plugins_play to load vars for managed_node3 29922 1726853654.70049: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853654.70053: Calling groups_plugins_play to load vars for managed_node3 29922 1726853654.70399: done sending task result for task 02083763-bbaf-51d4-513b-0000000000e2 29922 1726853654.70403: WORKER PROCESS EXITING 29922 1726853654.70428: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853654.70659: done with get_vars() 29922 1726853654.70670: done getting variables 29922 1726853654.70739: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Friday 20 September 2024 13:34:14 -0400 (0:00:00.026) 0:00:03.637 ****** 29922 1726853654.70774: entering _queue_task() for managed_node3/command 29922 1726853654.71167: worker is 1 (out of 1 available) 29922 1726853654.71182: exiting _queue_task() for managed_node3/command 29922 1726853654.71193: done queuing things up, now waiting for results queue to drain 29922 1726853654.71195: waiting for pending results... 
29922 1726853654.71364: running TaskExecutor() for managed_node3/TASK: Enable EPEL 8 29922 1726853654.71487: in run() - task 02083763-bbaf-51d4-513b-0000000000e3 29922 1726853654.71505: variable 'ansible_search_path' from source: unknown 29922 1726853654.71513: variable 'ansible_search_path' from source: unknown 29922 1726853654.71561: calling self._execute() 29922 1726853654.71656: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853654.71675: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853654.71695: variable 'omit' from source: magic vars 29922 1726853654.72113: variable 'ansible_distribution' from source: facts 29922 1726853654.72187: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 29922 1726853654.72278: variable 'ansible_distribution_major_version' from source: facts 29922 1726853654.72295: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 29922 1726853654.72304: when evaluation is False, skipping this task 29922 1726853654.72311: _execute() done 29922 1726853654.72318: dumping result to json 29922 1726853654.72326: done dumping result, returning 29922 1726853654.72337: done running TaskExecutor() for managed_node3/TASK: Enable EPEL 8 [02083763-bbaf-51d4-513b-0000000000e3] 29922 1726853654.72352: sending task result for task 02083763-bbaf-51d4-513b-0000000000e3 29922 1726853654.72567: done sending task result for task 02083763-bbaf-51d4-513b-0000000000e3 29922 1726853654.72572: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 29922 1726853654.72627: no more pending results, returning what we have 29922 1726853654.72631: results queue empty 29922 1726853654.72631: checking for any_errors_fatal 29922 1726853654.72637: done checking for any_errors_fatal 29922 1726853654.72638: checking for max_fail_percentage 29922 1726853654.72640: done checking for max_fail_percentage 29922 1726853654.72640: checking to see if all hosts have failed and the running result is not ok 29922 1726853654.72641: done checking to see if all hosts have failed 29922 1726853654.72642: getting the remaining hosts for this loop 29922 1726853654.72644: done getting the remaining hosts for this loop 29922 1726853654.72647: getting the next task for host managed_node3 29922 1726853654.72656: done getting next task for host managed_node3 29922 1726853654.72661: ^ task is: TASK: Enable EPEL 6 29922 1726853654.72666: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853654.72731: getting variables 29922 1726853654.72733: in VariableManager get_vars() 29922 1726853654.72756: Calling all_inventory to load vars for managed_node3 29922 1726853654.72761: Calling groups_inventory to load vars for managed_node3 29922 1726853654.72763: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853654.72774: Calling all_plugins_play to load vars for managed_node3 29922 1726853654.72840: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853654.72846: Calling groups_plugins_play to load vars for managed_node3 29922 1726853654.73041: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853654.73252: done with get_vars() 29922 1726853654.73263: done getting variables 29922 1726853654.73328: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Friday 20 September 2024 13:34:14 -0400 (0:00:00.025) 0:00:03.663 ****** 29922 1726853654.73355: entering _queue_task() for managed_node3/copy 29922 1726853654.73607: worker is 1 (out of 1 available) 29922 1726853654.73619: exiting _queue_task() for managed_node3/copy 29922 1726853654.73744: done queuing things up, now waiting for results queue to drain 29922 1726853654.73746: waiting for pending results... 29922 1726853654.73975: running TaskExecutor() for managed_node3/TASK: Enable EPEL 6 29922 1726853654.74013: in run() - task 02083763-bbaf-51d4-513b-0000000000e5 29922 1726853654.74029: variable 'ansible_search_path' from source: unknown 29922 1726853654.74035: variable 'ansible_search_path' from source: unknown 29922 1726853654.74088: calling self._execute() 29922 1726853654.74180: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853654.74194: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853654.74211: variable 'omit' from source: magic vars 29922 1726853654.74683: variable 'ansible_distribution' from source: facts 29922 1726853654.74699: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 29922 1726853654.74832: variable 'ansible_distribution_major_version' from source: facts 29922 1726853654.74846: Evaluated conditional (ansible_distribution_major_version == '6'): False 29922 1726853654.74853: when evaluation is False, skipping this task 29922 1726853654.74862: _execute() done 29922 1726853654.74868: dumping result to json 29922 1726853654.74877: done dumping result, returning 29922 1726853654.74887: done running TaskExecutor() for managed_node3/TASK: Enable EPEL 6 [02083763-bbaf-51d4-513b-0000000000e5] 29922 1726853654.74940: sending task result for task 02083763-bbaf-51d4-513b-0000000000e5 29922 1726853654.75014: done sending task result for task 02083763-bbaf-51d4-513b-0000000000e5 29922 1726853654.75017: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 29922 1726853654.75072: no more pending results, returning what we have 29922 
1726853654.75076: results queue empty 29922 1726853654.75077: checking for any_errors_fatal 29922 1726853654.75083: done checking for any_errors_fatal 29922 1726853654.75083: checking for max_fail_percentage 29922 1726853654.75085: done checking for max_fail_percentage 29922 1726853654.75086: checking to see if all hosts have failed and the running result is not ok 29922 1726853654.75086: done checking to see if all hosts have failed 29922 1726853654.75087: getting the remaining hosts for this loop 29922 1726853654.75089: done getting the remaining hosts for this loop 29922 1726853654.75092: getting the next task for host managed_node3 29922 1726853654.75100: done getting next task for host managed_node3 29922 1726853654.75103: ^ task is: TASK: Set network provider to 'nm' 29922 1726853654.75106: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29922 1726853654.75110: getting variables 29922 1726853654.75111: in VariableManager get_vars() 29922 1726853654.75139: Calling all_inventory to load vars for managed_node3 29922 1726853654.75141: Calling groups_inventory to load vars for managed_node3 29922 1726853654.75144: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853654.75276: Calling all_plugins_play to load vars for managed_node3 29922 1726853654.75280: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853654.75284: Calling groups_plugins_play to load vars for managed_node3 29922 1726853654.75633: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853654.75864: done with get_vars() 29922 1726853654.75874: done getting variables 29922 1726853654.75937: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tests_routing_rules_nm.yml:13 Friday 20 September 2024 13:34:14 -0400 (0:00:00.026) 0:00:03.689 ****** 29922 1726853654.75964: entering _queue_task() for managed_node3/set_fact 29922 1726853654.76236: worker is 1 (out of 1 available) 29922 1726853654.76363: exiting _queue_task() for managed_node3/set_fact 29922 1726853654.76376: done queuing things up, now waiting for results queue to drain 29922 1726853654.76377: waiting for pending results... 
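The three "Enable EPEL" tasks above differ only in guard and module: EPEL 7 and 8 go through the command plugin under the ['7', '8'] condition, while EPEL 6 uses copy under a '== 6' condition, and on this EL 10 host all three are skipped. A sketch of the EPEL 6 variant, with the copy payload left as placeholders since it is never rendered:

- name: Enable EPEL 6
  copy:
    dest: /etc/yum.repos.d/epel.repo        # placeholder path, not visible in this log
    content: "{{ epel6_repo_content }}"     # placeholder variable, not visible in this log
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version == '6'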
29922 1726853654.76698: running TaskExecutor() for managed_node3/TASK: Set network provider to 'nm' 29922 1726853654.76704: in run() - task 02083763-bbaf-51d4-513b-000000000007 29922 1726853654.76707: variable 'ansible_search_path' from source: unknown 29922 1726853654.76710: calling self._execute() 29922 1726853654.76805: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853654.76818: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853654.76833: variable 'omit' from source: magic vars 29922 1726853654.76961: variable 'omit' from source: magic vars 29922 1726853654.77011: variable 'omit' from source: magic vars 29922 1726853654.77076: variable 'omit' from source: magic vars 29922 1726853654.77103: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29922 1726853654.77161: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29922 1726853654.77227: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29922 1726853654.77230: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853654.77244: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853654.77285: variable 'inventory_hostname' from source: host vars for 'managed_node3' 29922 1726853654.77295: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853654.77335: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853654.77423: Set connection var ansible_connection to ssh 29922 1726853654.77447: Set connection var ansible_module_compression to ZIP_DEFLATED 29922 1726853654.77465: Set connection var ansible_shell_executable to /bin/sh 29922 1726853654.77550: Set connection var ansible_pipelining to False 29922 1726853654.77553: Set connection var ansible_timeout to 10 29922 1726853654.77555: Set connection var ansible_shell_type to sh 29922 1726853654.77560: variable 'ansible_shell_executable' from source: unknown 29922 1726853654.77563: variable 'ansible_connection' from source: unknown 29922 1726853654.77565: variable 'ansible_module_compression' from source: unknown 29922 1726853654.77567: variable 'ansible_shell_type' from source: unknown 29922 1726853654.77569: variable 'ansible_shell_executable' from source: unknown 29922 1726853654.77574: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853654.77576: variable 'ansible_pipelining' from source: unknown 29922 1726853654.77578: variable 'ansible_timeout' from source: unknown 29922 1726853654.77581: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853654.77805: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 29922 1726853654.77810: variable 'omit' from source: magic vars 29922 1726853654.77812: starting attempt loop 29922 1726853654.77815: running the handler 29922 1726853654.77817: handler run complete 29922 1726853654.77819: attempt loop complete, returning result 29922 1726853654.77821: _execute() done 29922 1726853654.77824: 
dumping result to json 29922 1726853654.77826: done dumping result, returning 29922 1726853654.77828: done running TaskExecutor() for managed_node3/TASK: Set network provider to 'nm' [02083763-bbaf-51d4-513b-000000000007] 29922 1726853654.77877: sending task result for task 02083763-bbaf-51d4-513b-000000000007 29922 1726853654.78063: done sending task result for task 02083763-bbaf-51d4-513b-000000000007 29922 1726853654.78066: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 29922 1726853654.78128: no more pending results, returning what we have 29922 1726853654.78132: results queue empty 29922 1726853654.78132: checking for any_errors_fatal 29922 1726853654.78138: done checking for any_errors_fatal 29922 1726853654.78139: checking for max_fail_percentage 29922 1726853654.78141: done checking for max_fail_percentage 29922 1726853654.78141: checking to see if all hosts have failed and the running result is not ok 29922 1726853654.78142: done checking to see if all hosts have failed 29922 1726853654.78143: getting the remaining hosts for this loop 29922 1726853654.78144: done getting the remaining hosts for this loop 29922 1726853654.78148: getting the next task for host managed_node3 29922 1726853654.78156: done getting next task for host managed_node3 29922 1726853654.78161: ^ task is: TASK: meta (flush_handlers) 29922 1726853654.78163: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29922 1726853654.78168: getting variables 29922 1726853654.78169: in VariableManager get_vars() 29922 1726853654.78202: Calling all_inventory to load vars for managed_node3 29922 1726853654.78205: Calling groups_inventory to load vars for managed_node3 29922 1726853654.78209: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853654.78221: Calling all_plugins_play to load vars for managed_node3 29922 1726853654.78225: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853654.78228: Calling groups_plugins_play to load vars for managed_node3 29922 1726853654.78612: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853654.78833: done with get_vars() 29922 1726853654.78844: done getting variables 29922 1726853654.78914: in VariableManager get_vars() 29922 1726853654.78930: Calling all_inventory to load vars for managed_node3 29922 1726853654.78932: Calling groups_inventory to load vars for managed_node3 29922 1726853654.78935: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853654.78939: Calling all_plugins_play to load vars for managed_node3 29922 1726853654.78941: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853654.78944: Calling groups_plugins_play to load vars for managed_node3 29922 1726853654.79132: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853654.79320: done with get_vars() 29922 1726853654.79334: done queuing things up, now waiting for results queue to drain 29922 1726853654.79336: results queue empty 29922 1726853654.79337: checking for any_errors_fatal 29922 1726853654.79339: done checking for any_errors_fatal 29922 1726853654.79340: checking for 
max_fail_percentage 29922 1726853654.79341: done checking for max_fail_percentage 29922 1726853654.79342: checking to see if all hosts have failed and the running result is not ok 29922 1726853654.79342: done checking to see if all hosts have failed 29922 1726853654.79343: getting the remaining hosts for this loop 29922 1726853654.79344: done getting the remaining hosts for this loop 29922 1726853654.79346: getting the next task for host managed_node3 29922 1726853654.79350: done getting next task for host managed_node3 29922 1726853654.79352: ^ task is: TASK: meta (flush_handlers) 29922 1726853654.79354: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29922 1726853654.79374: getting variables 29922 1726853654.79375: in VariableManager get_vars() 29922 1726853654.79383: Calling all_inventory to load vars for managed_node3 29922 1726853654.79386: Calling groups_inventory to load vars for managed_node3 29922 1726853654.79388: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853654.79392: Calling all_plugins_play to load vars for managed_node3 29922 1726853654.79394: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853654.79397: Calling groups_plugins_play to load vars for managed_node3 29922 1726853654.79538: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853654.79733: done with get_vars() 29922 1726853654.79740: done getting variables 29922 1726853654.79788: in VariableManager get_vars() 29922 1726853654.79802: Calling all_inventory to load vars for managed_node3 29922 1726853654.79804: Calling groups_inventory to load vars for managed_node3 29922 1726853654.79807: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853654.79811: Calling all_plugins_play to load vars for managed_node3 29922 1726853654.79813: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853654.79816: Calling groups_plugins_play to load vars for managed_node3 29922 1726853654.79989: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853654.80201: done with get_vars() 29922 1726853654.80212: done queuing things up, now waiting for results queue to drain 29922 1726853654.80213: results queue empty 29922 1726853654.80214: checking for any_errors_fatal 29922 1726853654.80215: done checking for any_errors_fatal 29922 1726853654.80216: checking for max_fail_percentage 29922 1726853654.80217: done checking for max_fail_percentage 29922 1726853654.80218: checking to see if all hosts have failed and the running result is not ok 29922 1726853654.80218: done checking to see if all hosts have failed 29922 1726853654.80219: getting the remaining hosts for this loop 29922 1726853654.80220: done getting the remaining hosts for this loop 29922 1726853654.80222: getting the next task for host managed_node3 29922 1726853654.80225: done getting next task for host managed_node3 29922 1726853654.80226: ^ task is: None 29922 1726853654.80227: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 29922 1726853654.80228: done queuing things up, now waiting for results queue to drain 29922 1726853654.80229: results queue empty 29922 1726853654.80230: checking for any_errors_fatal 29922 1726853654.80238: done checking for any_errors_fatal 29922 1726853654.80238: checking for max_fail_percentage 29922 1726853654.80239: done checking for max_fail_percentage 29922 1726853654.80240: checking to see if all hosts have failed and the running result is not ok 29922 1726853654.80241: done checking to see if all hosts have failed 29922 1726853654.80243: getting the next task for host managed_node3 29922 1726853654.80245: done getting next task for host managed_node3 29922 1726853654.80246: ^ task is: None 29922 1726853654.80247: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29922 1726853654.80292: in VariableManager get_vars() 29922 1726853654.80312: done with get_vars() 29922 1726853654.80317: in VariableManager get_vars() 29922 1726853654.80328: done with get_vars() 29922 1726853654.80332: variable 'omit' from source: magic vars 29922 1726853654.80384: in VariableManager get_vars() 29922 1726853654.80398: done with get_vars() 29922 1726853654.80419: variable 'omit' from source: magic vars PLAY [Test for testing routing rules] ****************************************** 29922 1726853654.80729: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 29922 1726853654.80753: getting the remaining hosts for this loop 29922 1726853654.80754: done getting the remaining hosts for this loop 29922 1726853654.80757: getting the next task for host managed_node3 29922 1726853654.80762: done getting next task for host managed_node3 29922 1726853654.80764: ^ task is: TASK: Gathering Facts 29922 1726853654.80765: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853654.80767: getting variables 29922 1726853654.80768: in VariableManager get_vars() 29922 1726853654.80787: Calling all_inventory to load vars for managed_node3 29922 1726853654.80789: Calling groups_inventory to load vars for managed_node3 29922 1726853654.80791: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853654.80795: Calling all_plugins_play to load vars for managed_node3 29922 1726853654.80808: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853654.80811: Calling groups_plugins_play to load vars for managed_node3 29922 1726853654.80968: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853654.81191: done with get_vars() 29922 1726853654.81199: done getting variables 29922 1726853654.81244: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:3 Friday 20 September 2024 13:34:14 -0400 (0:00:00.053) 0:00:03.742 ****** 29922 1726853654.81269: entering _queue_task() for managed_node3/gather_facts 29922 1726853654.81676: worker is 1 (out of 1 available) 29922 1726853654.81686: exiting _queue_task() for managed_node3/gather_facts 29922 1726853654.81697: done queuing things up, now waiting for results queue to drain 29922 1726853654.81698: waiting for pending results... 
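The "Set network provider to 'nm'" result a few entries back is the last task of the setup play before handlers are flushed and the "Test for testing routing rules" play begins; its shape follows directly from the fact it records:

- name: Set network provider to 'nm'
  set_fact:
    network_provider: nm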
29922 1726853654.81850: running TaskExecutor() for managed_node3/TASK: Gathering Facts 29922 1726853654.81987: in run() - task 02083763-bbaf-51d4-513b-00000000010b 29922 1726853654.81991: variable 'ansible_search_path' from source: unknown 29922 1726853654.82030: calling self._execute() 29922 1726853654.82111: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853654.82175: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853654.82203: variable 'omit' from source: magic vars 29922 1726853654.82637: variable 'ansible_distribution_major_version' from source: facts 29922 1726853654.82642: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853654.82645: variable 'omit' from source: magic vars 29922 1726853654.82650: variable 'omit' from source: magic vars 29922 1726853654.82694: variable 'omit' from source: magic vars 29922 1726853654.82735: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29922 1726853654.82785: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29922 1726853654.82807: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29922 1726853654.82855: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853654.82863: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853654.82882: variable 'inventory_hostname' from source: host vars for 'managed_node3' 29922 1726853654.82890: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853654.82896: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853654.83010: Set connection var ansible_connection to ssh 29922 1726853654.83079: Set connection var ansible_module_compression to ZIP_DEFLATED 29922 1726853654.83084: Set connection var ansible_shell_executable to /bin/sh 29922 1726853654.83086: Set connection var ansible_pipelining to False 29922 1726853654.83088: Set connection var ansible_timeout to 10 29922 1726853654.83090: Set connection var ansible_shell_type to sh 29922 1726853654.83097: variable 'ansible_shell_executable' from source: unknown 29922 1726853654.83106: variable 'ansible_connection' from source: unknown 29922 1726853654.83113: variable 'ansible_module_compression' from source: unknown 29922 1726853654.83120: variable 'ansible_shell_type' from source: unknown 29922 1726853654.83176: variable 'ansible_shell_executable' from source: unknown 29922 1726853654.83186: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853654.83189: variable 'ansible_pipelining' from source: unknown 29922 1726853654.83192: variable 'ansible_timeout' from source: unknown 29922 1726853654.83194: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853654.83498: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 29922 1726853654.83522: variable 'omit' from source: magic vars 29922 1726853654.83538: starting attempt loop 29922 1726853654.83546: running the 
handler 29922 1726853654.83573: variable 'ansible_facts' from source: unknown 29922 1726853654.83622: _low_level_execute_command(): starting 29922 1726853654.83625: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 29922 1726853654.84364: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853654.84750: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853654.84753: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853654.84829: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853654.87230: stdout chunk (state=3): >>>/root <<< 29922 1726853654.87583: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853654.87587: stdout chunk (state=3): >>><<< 29922 1726853654.87589: stderr chunk (state=3): >>><<< 29922 1726853654.87592: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853654.87595: _low_level_execute_command(): starting 29922 1726853654.87597: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853654.8753374-30103-265826281754474 `" && echo ansible-tmp-1726853654.8753374-30103-265826281754474="` echo /root/.ansible/tmp/ansible-tmp-1726853654.8753374-30103-265826281754474 `" ) && sleep 0' 
29922 1726853654.88647: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853654.88752: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853654.88762: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853654.88776: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853654.88904: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853654.89191: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853654.89194: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853654.91947: stdout chunk (state=3): >>>ansible-tmp-1726853654.8753374-30103-265826281754474=/root/.ansible/tmp/ansible-tmp-1726853654.8753374-30103-265826281754474 <<< 29922 1726853654.92210: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853654.92214: stdout chunk (state=3): >>><<< 29922 1726853654.92221: stderr chunk (state=3): >>><<< 29922 1726853654.92713: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853654.8753374-30103-265826281754474=/root/.ansible/tmp/ansible-tmp-1726853654.8753374-30103-265826281754474 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853654.92717: variable 'ansible_module_compression' from source: unknown 29922 1726853654.92720: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-29922avbm1zsd/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 29922 1726853654.92723: variable 
'ansible_facts' from source: unknown 29922 1726853654.93074: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853654.8753374-30103-265826281754474/AnsiballZ_setup.py 29922 1726853654.93331: Sending initial data 29922 1726853654.93382: Sent initial data (154 bytes) 29922 1726853654.95077: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853654.95093: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853654.95104: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853654.95287: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853654.95351: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853654.95433: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853654.97695: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 29922 1726853654.97763: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 29922 1726853654.97821: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-29922avbm1zsd/tmpkarywu_o /root/.ansible/tmp/ansible-tmp-1726853654.8753374-30103-265826281754474/AnsiballZ_setup.py <<< 29922 1726853654.97838: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853654.8753374-30103-265826281754474/AnsiballZ_setup.py" <<< 29922 1726853654.97918: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-29922avbm1zsd/tmpkarywu_o" to remote "/root/.ansible/tmp/ansible-tmp-1726853654.8753374-30103-265826281754474/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853654.8753374-30103-265826281754474/AnsiballZ_setup.py" <<< 29922 1726853655.00017: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853655.00028: stdout chunk (state=3): >>><<< 29922 1726853655.00041: stderr chunk (state=3): >>><<< 29922 1726853655.00067: done transferring module to remote 29922 1726853655.00086: _low_level_execute_command(): starting 29922 1726853655.00096: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853654.8753374-30103-265826281754474/ /root/.ansible/tmp/ansible-tmp-1726853654.8753374-30103-265826281754474/AnsiballZ_setup.py && sleep 0' 29922 1726853655.00718: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853655.00738: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853655.00761: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853655.00786: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853655.00803: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 29922 1726853655.00814: stderr chunk (state=3): >>>debug2: match not found <<< 29922 1726853655.00827: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853655.00885: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853655.00924: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853655.00942: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853655.00962: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853655.01100: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853655.03926: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853655.03961: stdout chunk (state=3): >>><<< 29922 1726853655.03979: stderr chunk (state=3): >>><<< 29922 1726853655.04276: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853655.04279: _low_level_execute_command(): starting 29922 1726853655.04282: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853654.8753374-30103-265826281754474/AnsiballZ_setup.py && sleep 0' 29922 1726853655.05105: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853655.05121: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853655.05224: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853655.05248: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853655.05350: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853655.89314: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-11-217.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-217", "ansible_nodename": "ip-10-31-11-217.us-east-1.aws.redhat.com", "ansible_domain": 
"us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2a85cf21f17783c9da20681cb8e352", "ansible_lsb": {}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_is_chroot": false, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCoYDjPBT+oZMoH+Iz89VMad3pzgkIKqOuUO8QZPyEpVfgGNjVOaglgkXLQyOXulclB6EA4nlBVmVP6IyWL+N4gskjf5Qmm5n+WHu3amXXm9l66v+yKWruV18sgIn8o6iAdCgZrJFzdFVNeKRLYV6P67syyegqRkOJe7U2m/rxA967Vm6wcrwJN8eiZc8tbx1lHOuJNCcP20ThNxMHIPfvT2im8rlt/ojf6e+Z1axRnvFBubBg1qDfxHEX6AlxMHl+CIOXxGHxsvSxfLq7lhrXrComBSxTDv+fmHeJkck3VGck2rn8Hz3eBTty453RU3pq6KxdqU1GB+Z+VYHInXEtXb2i38QpLqRfiLqwUxdjkFKqz0j5z/+NskQF3w8j6uz77Revs9G8eMs14sICJtnD9qBLhFuGxlR/ovZCjynVjBTKBfDaVASpjV0iCIZKSgLn6zSaM/6FBhnBoeb4ch2iiw2S8Q0aKJNKIp0AfY21IVplyS3VCu+Br3fIXzoINo0s=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIX3PaAHBZzPq23oQJkywX/2bB39yz9ZrSLgsNfhL04NHwnY0Up/oiN+aiUte1DWFqV5wiDLJpl9a1tDLARWXNA=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIBb2CS4KVp6KVVqnGA45j7tkSkijXfGxbd3neKpDsjh9", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_fips": false, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_loadavg": {"1m": 0.47412109375, "5m": 0.484375, "15m": 0.2958984375}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "34", "second": "15", "epoch": "1726853655", "epoch_int": "1726853655", "date": "2024-09-20", "time": "13:34:15", "iso8601_micro": "2024-09-20T17:34:15.478922Z", "iso8601": "2024-09-20T17:34:15Z", "iso8601_basic": "20240920T133415478922", "iso8601_basic_short": "20240920T133415", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_apparmor": {"status": "disabled"}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 55604 10.31.11.217 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RU<<< 29922 1726853655.89358: stdout chunk (state=3): >>>NTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 55604 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", 
"PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_iscsi_iqn": "", "ansible_fibre_channel_wwn": [], "ansible_local": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_interfaces": ["rpltstbr", "eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "12:2a:53:36:f0:e9", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::102a:53ff:fe36:f0e9", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": 
"off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_rpltstbr": {"device": "rpltstbr", "macaddress": "6e:7c:f1:8e:1c:81", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "ipv4": {"address": "192.0.2.72", "broadcast": "", "netmask": "255.255.255.254", "network": "192.0.2.72", "prefix": "31"}, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", 
"tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocach<<< 29922 1726853655.89401: stdout chunk (state=3): >>>e_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:2a:53:36:f0:e9", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.217", "192.0.2.72"], "ansible_all_ipv6_addresses": ["fe80::102a:53ff:fe36:f0e9"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.217", "127.0.0.0/8", "127.0.0.1", "192.0.2.72"], "ipv6": ["::1", "fe80::102a:53ff:fe36:f0e9"]}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2969, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 562, "free": 2969}, "nocache": {"free": 3287, "used": 244}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": 
"ec2a85cf-21f1-7783-c9da-20681cb8e352", "ansible_product_uuid": "ec2a85cf-21f1-7783-c9da-20681cb8e352", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 799, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261798584320, "block_size": 4096, "block_total": 65519099, "block_available": 63915670, "block_used": 1603429, "inode_total": 131070960, "inode_available": 131029144, "inode_used": 41816, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 29922 1726853655.92087: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 29922 1726853655.92104: stdout chunk (state=3): >>><<< 29922 1726853655.92117: stderr chunk (state=3): >>><<< 29922 1726853655.92164: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-11-217.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-217", "ansible_nodename": "ip-10-31-11-217.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2a85cf21f17783c9da20681cb8e352", "ansible_lsb": {}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_is_chroot": false, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCoYDjPBT+oZMoH+Iz89VMad3pzgkIKqOuUO8QZPyEpVfgGNjVOaglgkXLQyOXulclB6EA4nlBVmVP6IyWL+N4gskjf5Qmm5n+WHu3amXXm9l66v+yKWruV18sgIn8o6iAdCgZrJFzdFVNeKRLYV6P67syyegqRkOJe7U2m/rxA967Vm6wcrwJN8eiZc8tbx1lHOuJNCcP20ThNxMHIPfvT2im8rlt/ojf6e+Z1axRnvFBubBg1qDfxHEX6AlxMHl+CIOXxGHxsvSxfLq7lhrXrComBSxTDv+fmHeJkck3VGck2rn8Hz3eBTty453RU3pq6KxdqU1GB+Z+VYHInXEtXb2i38QpLqRfiLqwUxdjkFKqz0j5z/+NskQF3w8j6uz77Revs9G8eMs14sICJtnD9qBLhFuGxlR/ovZCjynVjBTKBfDaVASpjV0iCIZKSgLn6zSaM/6FBhnBoeb4ch2iiw2S8Q0aKJNKIp0AfY21IVplyS3VCu+Br3fIXzoINo0s=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIX3PaAHBZzPq23oQJkywX/2bB39yz9ZrSLgsNfhL04NHwnY0Up/oiN+aiUte1DWFqV5wiDLJpl9a1tDLARWXNA=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIBb2CS4KVp6KVVqnGA45j7tkSkijXfGxbd3neKpDsjh9", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_fips": false, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_loadavg": {"1m": 0.47412109375, "5m": 0.484375, "15m": 0.2958984375}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "34", "second": "15", "epoch": "1726853655", "epoch_int": "1726853655", "date": "2024-09-20", "time": "13:34:15", "iso8601_micro": 
"2024-09-20T17:34:15.478922Z", "iso8601": "2024-09-20T17:34:15Z", "iso8601_basic": "20240920T133415478922", "iso8601_basic_short": "20240920T133415", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_apparmor": {"status": "disabled"}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 55604 10.31.11.217 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 55604 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_iscsi_iqn": "", "ansible_fibre_channel_wwn": [], "ansible_local": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_interfaces": ["rpltstbr", "eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "12:2a:53:36:f0:e9", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::102a:53ff:fe36:f0e9", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", 
"tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", 
"hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_rpltstbr": {"device": "rpltstbr", "macaddress": "6e:7c:f1:8e:1c:81", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "ipv4": {"address": "192.0.2.72", "broadcast": "", "netmask": "255.255.255.254", "network": "192.0.2.72", "prefix": "31"}, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:2a:53:36:f0:e9", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.217", "192.0.2.72"], "ansible_all_ipv6_addresses": ["fe80::102a:53ff:fe36:f0e9"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.217", "127.0.0.0/8", "127.0.0.1", "192.0.2.72"], "ipv6": ["::1", "fe80::102a:53ff:fe36:f0e9"]}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 
2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2969, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 562, "free": 2969}, "nocache": {"free": 3287, "used": 244}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2a85cf-21f1-7783-c9da-20681cb8e352", "ansible_product_uuid": "ec2a85cf-21f1-7783-c9da-20681cb8e352", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 799, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261798584320, "block_size": 4096, "block_total": 65519099, "block_available": 63915670, "block_used": 1603429, "inode_total": 131070960, "inode_available": 131029144, "inode_used": 41816, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 29922 1726853655.92587: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853654.8753374-30103-265826281754474/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 29922 1726853655.92590: _low_level_execute_command(): starting 29922 1726853655.92592: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853654.8753374-30103-265826281754474/ > /dev/null 2>&1 && sleep 0' 29922 1726853655.93444: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853655.93566: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853655.95657: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853655.95705: stderr chunk (state=3): >>><<< 29922 1726853655.95710: stdout chunk (state=3): >>><<< 29922 1726853655.95731: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853655.95738: handler run complete 29922 1726853655.95875: variable 'ansible_facts' from source: unknown 29922 1726853655.95954: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853655.96302: variable 'ansible_facts' from source: unknown 29922 1726853655.96446: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853655.96518: attempt loop complete, returning result 29922 1726853655.96521: _execute() done 29922 1726853655.96524: dumping result to json 29922 1726853655.96562: done dumping result, returning 29922 1726853655.96568: done running TaskExecutor() for managed_node3/TASK: Gathering Facts [02083763-bbaf-51d4-513b-00000000010b] 29922 1726853655.96583: sending task result for task 02083763-bbaf-51d4-513b-00000000010b 29922 1726853655.97560: done sending task result for task 02083763-bbaf-51d4-513b-00000000010b 29922 1726853655.97564: WORKER PROCESS EXITING ok: [managed_node3] 29922 1726853655.98010: no more pending results, returning what we have 29922 1726853655.98013: results queue empty 29922 1726853655.98014: checking for any_errors_fatal 29922 1726853655.98015: done checking for any_errors_fatal 29922 1726853655.98016: checking for max_fail_percentage 29922 1726853655.98061: done checking for max_fail_percentage 29922 1726853655.98063: checking to see if all hosts have failed and the running result is not ok 29922 1726853655.98064: done checking to see if all hosts have failed 29922 1726853655.98064: getting the remaining hosts for this loop 29922 1726853655.98065: done getting the remaining hosts for this loop 29922 1726853655.98069: getting the next task for host managed_node3 29922 1726853655.98077: done getting next task for host managed_node3 29922 1726853655.98079: ^ task is: TASK: meta (flush_handlers) 29922 1726853655.98081: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853655.98085: getting variables 29922 1726853655.98086: in VariableManager get_vars() 29922 1726853655.98113: Calling all_inventory to load vars for managed_node3 29922 1726853655.98116: Calling groups_inventory to load vars for managed_node3 29922 1726853655.98118: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853655.98128: Calling all_plugins_play to load vars for managed_node3 29922 1726853655.98131: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853655.98134: Calling groups_plugins_play to load vars for managed_node3 29922 1726853655.98291: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853655.98433: done with get_vars() 29922 1726853655.98444: done getting variables 29922 1726853655.98495: in VariableManager get_vars() 29922 1726853655.98510: Calling all_inventory to load vars for managed_node3 29922 1726853655.98512: Calling groups_inventory to load vars for managed_node3 29922 1726853655.98513: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853655.98517: Calling all_plugins_play to load vars for managed_node3 29922 1726853655.98518: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853655.98520: Calling groups_plugins_play to load vars for managed_node3 29922 1726853655.98609: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853655.98739: done with get_vars() 29922 1726853655.98749: done queuing things up, now waiting for results queue to drain 29922 1726853655.98750: results queue empty 29922 1726853655.98750: checking for any_errors_fatal 29922 1726853655.98752: done checking for any_errors_fatal 29922 1726853655.98753: checking for max_fail_percentage 29922 1726853655.98753: done checking for max_fail_percentage 29922 1726853655.98754: checking to see if all hosts have failed and the running result is not ok 29922 1726853655.98754: done checking to see if all hosts have failed 29922 1726853655.98755: getting the remaining hosts for this loop 29922 1726853655.98755: done getting the remaining hosts for this loop 29922 1726853655.98761: getting the next task for host managed_node3 29922 1726853655.98764: done getting next task for host managed_node3 29922 1726853655.98767: ^ task is: TASK: Set type={{ type }} and interface={{ interface }} 29922 1726853655.98768: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853655.98770: getting variables 29922 1726853655.98772: in VariableManager get_vars() 29922 1726853655.98780: Calling all_inventory to load vars for managed_node3 29922 1726853655.98781: Calling groups_inventory to load vars for managed_node3 29922 1726853655.98782: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853655.98785: Calling all_plugins_play to load vars for managed_node3 29922 1726853655.98786: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853655.98788: Calling groups_plugins_play to load vars for managed_node3 29922 1726853655.98868: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853655.98988: done with get_vars() 29922 1726853655.98994: done getting variables 29922 1726853655.99022: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 29922 1726853655.99125: variable 'type' from source: play vars 29922 1726853655.99129: variable 'interface' from source: play vars TASK [Set type=veth and interface=ethtest0] ************************************ task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:10 Friday 20 September 2024 13:34:15 -0400 (0:00:01.178) 0:00:04.921 ****** 29922 1726853655.99155: entering _queue_task() for managed_node3/set_fact 29922 1726853655.99381: worker is 1 (out of 1 available) 29922 1726853655.99393: exiting _queue_task() for managed_node3/set_fact 29922 1726853655.99405: done queuing things up, now waiting for results queue to drain 29922 1726853655.99407: waiting for pending results... 
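The task header above points at tests_routing_rules.yml:10. Judging from the play vars it reads and the ansible_facts it reports a little further down (type=veth, interface=ethtest0), the task is plausibly a plain set_fact that promotes the two play vars to host facts. This is only a sketch reconstructed from the log, not the actual file contents:

    - name: "Set type={{ type }} and interface={{ interface }}"
      set_fact:
        type: "{{ type }}"            # renders to "veth" from the play vars
        interface: "{{ interface }}"  # renders to "ethtest0" from the play vars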
29922 1726853655.99564: running TaskExecutor() for managed_node3/TASK: Set type=veth and interface=ethtest0 29922 1726853655.99622: in run() - task 02083763-bbaf-51d4-513b-00000000000b 29922 1726853655.99633: variable 'ansible_search_path' from source: unknown 29922 1726853655.99670: calling self._execute() 29922 1726853655.99741: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853655.99745: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853655.99755: variable 'omit' from source: magic vars 29922 1726853656.00176: variable 'ansible_distribution_major_version' from source: facts 29922 1726853656.00179: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853656.00181: variable 'omit' from source: magic vars 29922 1726853656.00185: variable 'omit' from source: magic vars 29922 1726853656.00187: variable 'type' from source: play vars 29922 1726853656.00222: variable 'type' from source: play vars 29922 1726853656.00240: variable 'interface' from source: play vars 29922 1726853656.00312: variable 'interface' from source: play vars 29922 1726853656.00407: variable 'omit' from source: magic vars 29922 1726853656.00562: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29922 1726853656.00585: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29922 1726853656.00610: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29922 1726853656.00647: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853656.00666: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853656.00705: variable 'inventory_hostname' from source: host vars for 'managed_node3' 29922 1726853656.00715: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853656.00723: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853656.00838: Set connection var ansible_connection to ssh 29922 1726853656.00867: Set connection var ansible_module_compression to ZIP_DEFLATED 29922 1726853656.00886: Set connection var ansible_shell_executable to /bin/sh 29922 1726853656.00987: Set connection var ansible_pipelining to False 29922 1726853656.00990: Set connection var ansible_timeout to 10 29922 1726853656.00993: Set connection var ansible_shell_type to sh 29922 1726853656.00995: variable 'ansible_shell_executable' from source: unknown 29922 1726853656.00997: variable 'ansible_connection' from source: unknown 29922 1726853656.00999: variable 'ansible_module_compression' from source: unknown 29922 1726853656.01001: variable 'ansible_shell_type' from source: unknown 29922 1726853656.01002: variable 'ansible_shell_executable' from source: unknown 29922 1726853656.01004: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853656.01006: variable 'ansible_pipelining' from source: unknown 29922 1726853656.01175: variable 'ansible_timeout' from source: unknown 29922 1726853656.01178: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853656.01181: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 29922 1726853656.01183: variable 'omit' from source: magic vars 29922 1726853656.01185: starting attempt loop 29922 1726853656.01187: running the handler 29922 1726853656.01210: handler run complete 29922 1726853656.01225: attempt loop complete, returning result 29922 1726853656.01233: _execute() done 29922 1726853656.01238: dumping result to json 29922 1726853656.01244: done dumping result, returning 29922 1726853656.01254: done running TaskExecutor() for managed_node3/TASK: Set type=veth and interface=ethtest0 [02083763-bbaf-51d4-513b-00000000000b] 29922 1726853656.01262: sending task result for task 02083763-bbaf-51d4-513b-00000000000b ok: [managed_node3] => { "ansible_facts": { "interface": "ethtest0", "type": "veth" }, "changed": false } 29922 1726853656.01466: no more pending results, returning what we have 29922 1726853656.01469: results queue empty 29922 1726853656.01472: checking for any_errors_fatal 29922 1726853656.01476: done checking for any_errors_fatal 29922 1726853656.01476: checking for max_fail_percentage 29922 1726853656.01478: done checking for max_fail_percentage 29922 1726853656.01479: checking to see if all hosts have failed and the running result is not ok 29922 1726853656.01480: done checking to see if all hosts have failed 29922 1726853656.01480: getting the remaining hosts for this loop 29922 1726853656.01481: done getting the remaining hosts for this loop 29922 1726853656.01485: getting the next task for host managed_node3 29922 1726853656.01491: done getting next task for host managed_node3 29922 1726853656.01494: ^ task is: TASK: Include the task 'show_interfaces.yml' 29922 1726853656.01496: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853656.01500: getting variables 29922 1726853656.01502: in VariableManager get_vars() 29922 1726853656.01655: Calling all_inventory to load vars for managed_node3 29922 1726853656.01658: Calling groups_inventory to load vars for managed_node3 29922 1726853656.01660: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853656.01666: done sending task result for task 02083763-bbaf-51d4-513b-00000000000b 29922 1726853656.01669: WORKER PROCESS EXITING 29922 1726853656.01833: Calling all_plugins_play to load vars for managed_node3 29922 1726853656.01838: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853656.01842: Calling groups_plugins_play to load vars for managed_node3 29922 1726853656.02095: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853656.02367: done with get_vars() 29922 1726853656.02419: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:14 Friday 20 September 2024 13:34:16 -0400 (0:00:00.034) 0:00:04.955 ****** 29922 1726853656.02569: entering _queue_task() for managed_node3/include_tasks 29922 1726853656.02907: worker is 1 (out of 1 available) 29922 1726853656.02918: exiting _queue_task() for managed_node3/include_tasks 29922 1726853656.02934: done queuing things up, now waiting for results queue to drain 29922 1726853656.02935: waiting for pending results... 29922 1726853656.03131: running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' 29922 1726853656.03211: in run() - task 02083763-bbaf-51d4-513b-00000000000c 29922 1726853656.03227: variable 'ansible_search_path' from source: unknown 29922 1726853656.03256: calling self._execute() 29922 1726853656.03327: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853656.03334: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853656.03345: variable 'omit' from source: magic vars 29922 1726853656.03617: variable 'ansible_distribution_major_version' from source: facts 29922 1726853656.03631: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853656.03639: _execute() done 29922 1726853656.03644: dumping result to json 29922 1726853656.03648: done dumping result, returning 29922 1726853656.03656: done running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' [02083763-bbaf-51d4-513b-00000000000c] 29922 1726853656.03664: sending task result for task 02083763-bbaf-51d4-513b-00000000000c 29922 1726853656.03765: done sending task result for task 02083763-bbaf-51d4-513b-00000000000c 29922 1726853656.03767: WORKER PROCESS EXITING 29922 1726853656.03793: no more pending results, returning what we have 29922 1726853656.03798: in VariableManager get_vars() 29922 1726853656.03840: Calling all_inventory to load vars for managed_node3 29922 1726853656.03842: Calling groups_inventory to load vars for managed_node3 29922 1726853656.03844: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853656.03854: Calling all_plugins_play to load vars for managed_node3 29922 1726853656.03856: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853656.03859: Calling groups_plugins_play to load vars for managed_node3 29922 1726853656.04000: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853656.04124: done with get_vars() 29922 1726853656.04130: variable 'ansible_search_path' from source: unknown 29922 1726853656.04167: we have included files to process 29922 1726853656.04168: generating all_blocks data 29922 1726853656.04170: done generating all_blocks data 29922 1726853656.04170: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 29922 1726853656.04173: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 29922 1726853656.04175: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 29922 1726853656.04305: in VariableManager get_vars() 29922 1726853656.04318: done with get_vars() 29922 1726853656.04409: done processing included file 29922 1726853656.04411: iterating over new_blocks loaded from include file 29922 1726853656.04412: in VariableManager get_vars() 29922 1726853656.04427: done with get_vars() 29922 1726853656.04429: filtering new block on tags 29922 1726853656.04445: done filtering new block on tags 29922 1726853656.04447: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node3 29922 1726853656.04451: extending task lists for all hosts with included blocks 29922 1726853656.05646: done extending task lists 29922 1726853656.05648: done processing included files 29922 1726853656.05649: results queue empty 29922 1726853656.05650: checking for any_errors_fatal 29922 1726853656.05653: done checking for any_errors_fatal 29922 1726853656.05654: checking for max_fail_percentage 29922 1726853656.05655: done checking for max_fail_percentage 29922 1726853656.05655: checking to see if all hosts have failed and the running result is not ok 29922 1726853656.05656: done checking to see if all hosts have failed 29922 1726853656.05657: getting the remaining hosts for this loop 29922 1726853656.05658: done getting the remaining hosts for this loop 29922 1726853656.05660: getting the next task for host managed_node3 29922 1726853656.05663: done getting next task for host managed_node3 29922 1726853656.05665: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 29922 1726853656.05667: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853656.05670: getting variables 29922 1726853656.05672: in VariableManager get_vars() 29922 1726853656.05683: Calling all_inventory to load vars for managed_node3 29922 1726853656.05685: Calling groups_inventory to load vars for managed_node3 29922 1726853656.05687: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853656.05692: Calling all_plugins_play to load vars for managed_node3 29922 1726853656.05695: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853656.05697: Calling groups_plugins_play to load vars for managed_node3 29922 1726853656.06027: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853656.06435: done with get_vars() 29922 1726853656.06443: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 13:34:16 -0400 (0:00:00.039) 0:00:04.994 ****** 29922 1726853656.06516: entering _queue_task() for managed_node3/include_tasks 29922 1726853656.06999: worker is 1 (out of 1 available) 29922 1726853656.07024: exiting _queue_task() for managed_node3/include_tasks 29922 1726853656.07038: done queuing things up, now waiting for results queue to drain 29922 1726853656.07039: waiting for pending results... 29922 1726853656.07291: running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' 29922 1726853656.07479: in run() - task 02083763-bbaf-51d4-513b-000000000121 29922 1726853656.07482: variable 'ansible_search_path' from source: unknown 29922 1726853656.07485: variable 'ansible_search_path' from source: unknown 29922 1726853656.07487: calling self._execute() 29922 1726853656.07676: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853656.07680: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853656.07682: variable 'omit' from source: magic vars 29922 1726853656.07912: variable 'ansible_distribution_major_version' from source: facts 29922 1726853656.07928: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853656.07938: _execute() done 29922 1726853656.07944: dumping result to json 29922 1726853656.07952: done dumping result, returning 29922 1726853656.07963: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' [02083763-bbaf-51d4-513b-000000000121] 29922 1726853656.07973: sending task result for task 02083763-bbaf-51d4-513b-000000000121 29922 1726853656.08089: no more pending results, returning what we have 29922 1726853656.08094: in VariableManager get_vars() 29922 1726853656.08137: Calling all_inventory to load vars for managed_node3 29922 1726853656.08140: Calling groups_inventory to load vars for managed_node3 29922 1726853656.08142: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853656.08156: Calling all_plugins_play to load vars for managed_node3 29922 1726853656.08159: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853656.08161: Calling groups_plugins_play to load vars for managed_node3 29922 1726853656.08467: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853656.08888: done with get_vars() 29922 1726853656.08896: variable 'ansible_search_path' from source: 
unknown 29922 1726853656.08898: variable 'ansible_search_path' from source: unknown 29922 1726853656.08911: done sending task result for task 02083763-bbaf-51d4-513b-000000000121 29922 1726853656.08914: WORKER PROCESS EXITING 29922 1726853656.08941: we have included files to process 29922 1726853656.08943: generating all_blocks data 29922 1726853656.08944: done generating all_blocks data 29922 1726853656.08946: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 29922 1726853656.08947: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 29922 1726853656.08949: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 29922 1726853656.09452: done processing included file 29922 1726853656.09454: iterating over new_blocks loaded from include file 29922 1726853656.09455: in VariableManager get_vars() 29922 1726853656.09472: done with get_vars() 29922 1726853656.09474: filtering new block on tags 29922 1726853656.09491: done filtering new block on tags 29922 1726853656.09493: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node3 29922 1726853656.09498: extending task lists for all hosts with included blocks 29922 1726853656.09805: done extending task lists 29922 1726853656.09807: done processing included files 29922 1726853656.09808: results queue empty 29922 1726853656.09808: checking for any_errors_fatal 29922 1726853656.09811: done checking for any_errors_fatal 29922 1726853656.09812: checking for max_fail_percentage 29922 1726853656.09813: done checking for max_fail_percentage 29922 1726853656.09814: checking to see if all hosts have failed and the running result is not ok 29922 1726853656.09815: done checking to see if all hosts have failed 29922 1726853656.09815: getting the remaining hosts for this loop 29922 1726853656.09816: done getting the remaining hosts for this loop 29922 1726853656.09819: getting the next task for host managed_node3 29922 1726853656.09823: done getting next task for host managed_node3 29922 1726853656.09825: ^ task is: TASK: Gather current interface info 29922 1726853656.09827: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853656.09830: getting variables 29922 1726853656.09830: in VariableManager get_vars() 29922 1726853656.09843: Calling all_inventory to load vars for managed_node3 29922 1726853656.09845: Calling groups_inventory to load vars for managed_node3 29922 1726853656.09847: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853656.09852: Calling all_plugins_play to load vars for managed_node3 29922 1726853656.09854: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853656.09857: Calling groups_plugins_play to load vars for managed_node3 29922 1726853656.10464: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853656.10668: done with get_vars() 29922 1726853656.10678: done getting variables 29922 1726853656.10717: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 13:34:16 -0400 (0:00:00.042) 0:00:05.037 ****** 29922 1726853656.10745: entering _queue_task() for managed_node3/command 29922 1726853656.11044: worker is 1 (out of 1 available) 29922 1726853656.11057: exiting _queue_task() for managed_node3/command 29922 1726853656.11069: done queuing things up, now waiting for results queue to drain 29922 1726853656.11072: waiting for pending results... 
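The 'Gather current interface info' task at get_current_interfaces.yml:3 is queued next. The module arguments logged below (chdir=/sys/class/net, _raw_params='ls -1') and the _current_interfaces variable referenced afterwards suggest a task roughly like the following; the register name and exact layout are inferred from the log, not taken from the file:

    - name: Gather current interface info
      command:
        cmd: ls -1
        chdir: /sys/class/net
      register: _current_interfaces   # assumed name, based on the variable resolved later in the log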
29922 1726853656.11318: running TaskExecutor() for managed_node3/TASK: Gather current interface info 29922 1726853656.11431: in run() - task 02083763-bbaf-51d4-513b-0000000001b0 29922 1726853656.11450: variable 'ansible_search_path' from source: unknown 29922 1726853656.11456: variable 'ansible_search_path' from source: unknown 29922 1726853656.11501: calling self._execute() 29922 1726853656.11582: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853656.11593: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853656.11609: variable 'omit' from source: magic vars 29922 1726853656.11961: variable 'ansible_distribution_major_version' from source: facts 29922 1726853656.11981: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853656.11994: variable 'omit' from source: magic vars 29922 1726853656.12037: variable 'omit' from source: magic vars 29922 1726853656.12078: variable 'omit' from source: magic vars 29922 1726853656.12125: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29922 1726853656.12163: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29922 1726853656.12187: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29922 1726853656.12212: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853656.12226: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853656.12259: variable 'inventory_hostname' from source: host vars for 'managed_node3' 29922 1726853656.12268: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853656.12277: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853656.12377: Set connection var ansible_connection to ssh 29922 1726853656.12389: Set connection var ansible_module_compression to ZIP_DEFLATED 29922 1726853656.12401: Set connection var ansible_shell_executable to /bin/sh 29922 1726853656.12411: Set connection var ansible_pipelining to False 29922 1726853656.12427: Set connection var ansible_timeout to 10 29922 1726853656.12433: Set connection var ansible_shell_type to sh 29922 1726853656.12459: variable 'ansible_shell_executable' from source: unknown 29922 1726853656.12467: variable 'ansible_connection' from source: unknown 29922 1726853656.12476: variable 'ansible_module_compression' from source: unknown 29922 1726853656.12482: variable 'ansible_shell_type' from source: unknown 29922 1726853656.12488: variable 'ansible_shell_executable' from source: unknown 29922 1726853656.12495: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853656.12502: variable 'ansible_pipelining' from source: unknown 29922 1726853656.12528: variable 'ansible_timeout' from source: unknown 29922 1726853656.12532: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853656.12655: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 29922 1726853656.12746: variable 'omit' from source: magic vars 29922 
1726853656.12749: starting attempt loop 29922 1726853656.12752: running the handler 29922 1726853656.12754: _low_level_execute_command(): starting 29922 1726853656.12756: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 29922 1726853656.13408: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853656.13487: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853656.13538: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853656.13565: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853656.13589: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853656.13700: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 29922 1726853656.16095: stdout chunk (state=3): >>>/root <<< 29922 1726853656.16235: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853656.16268: stderr chunk (state=3): >>><<< 29922 1726853656.16281: stdout chunk (state=3): >>><<< 29922 1726853656.16300: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 29922 1726853656.16311: _low_level_execute_command(): starting 29922 1726853656.16317: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853656.1629913-30175-50483765288754 `" && echo 
ansible-tmp-1726853656.1629913-30175-50483765288754="` echo /root/.ansible/tmp/ansible-tmp-1726853656.1629913-30175-50483765288754 `" ) && sleep 0' 29922 1726853656.16734: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853656.16738: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 29922 1726853656.16769: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853656.16777: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853656.16789: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853656.16834: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853656.16839: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853656.16842: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853656.16903: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 29922 1726853656.19776: stdout chunk (state=3): >>>ansible-tmp-1726853656.1629913-30175-50483765288754=/root/.ansible/tmp/ansible-tmp-1726853656.1629913-30175-50483765288754 <<< 29922 1726853656.19944: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853656.19974: stderr chunk (state=3): >>><<< 29922 1726853656.19977: stdout chunk (state=3): >>><<< 29922 1726853656.19993: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853656.1629913-30175-50483765288754=/root/.ansible/tmp/ansible-tmp-1726853656.1629913-30175-50483765288754 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 29922 
1726853656.20022: variable 'ansible_module_compression' from source: unknown 29922 1726853656.20068: ANSIBALLZ: Using generic lock for ansible.legacy.command 29922 1726853656.20073: ANSIBALLZ: Acquiring lock 29922 1726853656.20076: ANSIBALLZ: Lock acquired: 140376041361328 29922 1726853656.20078: ANSIBALLZ: Creating module 29922 1726853656.34079: ANSIBALLZ: Writing module into payload 29922 1726853656.34083: ANSIBALLZ: Writing module 29922 1726853656.34085: ANSIBALLZ: Renaming module 29922 1726853656.34087: ANSIBALLZ: Done creating module 29922 1726853656.34088: variable 'ansible_facts' from source: unknown 29922 1726853656.34104: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853656.1629913-30175-50483765288754/AnsiballZ_command.py 29922 1726853656.34293: Sending initial data 29922 1726853656.34302: Sent initial data (155 bytes) 29922 1726853656.34842: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853656.34857: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853656.34882: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853656.34903: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853656.34919: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 29922 1726853656.34933: stderr chunk (state=3): >>>debug2: match not found <<< 29922 1726853656.34986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853656.35034: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853656.35060: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853656.35080: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853656.35181: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853656.36846: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 29922 1726853656.36910: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 29922 1726853656.37006: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-29922avbm1zsd/tmp32q28alv /root/.ansible/tmp/ansible-tmp-1726853656.1629913-30175-50483765288754/AnsiballZ_command.py <<< 29922 1726853656.37137: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853656.1629913-30175-50483765288754/AnsiballZ_command.py" <<< 29922 1726853656.37179: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-29922avbm1zsd/tmp32q28alv" to remote "/root/.ansible/tmp/ansible-tmp-1726853656.1629913-30175-50483765288754/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853656.1629913-30175-50483765288754/AnsiballZ_command.py" <<< 29922 1726853656.37967: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853656.37983: stderr chunk (state=3): >>><<< 29922 1726853656.37987: stdout chunk (state=3): >>><<< 29922 1726853656.38007: done transferring module to remote 29922 1726853656.38016: _low_level_execute_command(): starting 29922 1726853656.38021: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853656.1629913-30175-50483765288754/ /root/.ansible/tmp/ansible-tmp-1726853656.1629913-30175-50483765288754/AnsiballZ_command.py && sleep 0' 29922 1726853656.38444: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853656.38448: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853656.38459: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853656.38520: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853656.38527: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853656.38601: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853656.40757: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853656.40833: stderr chunk (state=3): >>><<< 29922 1726853656.40837: stdout chunk (state=3): >>><<< 29922 1726853656.40839: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853656.40842: _low_level_execute_command(): starting 29922 1726853656.40844: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853656.1629913-30175-50483765288754/AnsiballZ_command.py && sleep 0' 29922 1726853656.41250: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853656.41253: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853656.41256: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853656.41258: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 29922 1726853656.41260: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853656.41311: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853656.41314: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853656.41388: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853656.57709: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo\nrpltstbr", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 13:34:16.572336", "end": "2024-09-20 13:34:16.575898", "delta": "0:00:00.003562", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 29922 1726853656.59319: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 29922 1726853656.59347: stderr chunk (state=3): >>><<< 29922 1726853656.59350: stdout chunk (state=3): >>><<< 29922 1726853656.59377: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo\nrpltstbr", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 13:34:16.572336", "end": "2024-09-20 13:34:16.575898", "delta": "0:00:00.003562", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
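Unpacked from the module JSON above, the registered result (assuming it is stored as _current_interfaces, the variable resolved later) would look roughly like this; stdout_lines is the standard command-module field split from stdout. Although the module itself returns changed=true, the task result printed below reports changed=false, which together with the 'Evaluated conditional (False): False' line suggests a changed_when: false on the task (an inference, not confirmed by the log):

    _current_interfaces:
      changed: true
      rc: 0
      delta: "0:00:00.003562"
      stdout: "bonding_masters\neth0\nlo\nrpltstbr"
      stdout_lines:
        - bonding_masters
        - eth0
        - lo
        - rpltstbr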
29922 1726853656.59404: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853656.1629913-30175-50483765288754/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 29922 1726853656.59411: _low_level_execute_command(): starting 29922 1726853656.59416: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853656.1629913-30175-50483765288754/ > /dev/null 2>&1 && sleep 0' 29922 1726853656.59878: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853656.59881: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853656.59884: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853656.59890: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 29922 1726853656.59893: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853656.59936: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853656.59941: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853656.59943: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853656.60002: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853656.62206: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853656.62210: stdout chunk (state=3): >>><<< 29922 1726853656.62212: stderr chunk (state=3): >>><<< 29922 1726853656.62215: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853656.62218: handler run complete 29922 1726853656.62383: Evaluated conditional (False): False 29922 1726853656.62386: attempt loop complete, returning result 29922 1726853656.62388: _execute() done 29922 1726853656.62390: dumping result to json 29922 1726853656.62392: done dumping result, returning 29922 1726853656.62394: done running TaskExecutor() for managed_node3/TASK: Gather current interface info [02083763-bbaf-51d4-513b-0000000001b0] 29922 1726853656.62395: sending task result for task 02083763-bbaf-51d4-513b-0000000001b0 ok: [managed_node3] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003562", "end": "2024-09-20 13:34:16.575898", "rc": 0, "start": "2024-09-20 13:34:16.572336" } STDOUT: bonding_masters eth0 lo rpltstbr 29922 1726853656.62648: no more pending results, returning what we have 29922 1726853656.62652: results queue empty 29922 1726853656.62653: checking for any_errors_fatal 29922 1726853656.62654: done checking for any_errors_fatal 29922 1726853656.62655: checking for max_fail_percentage 29922 1726853656.62657: done checking for max_fail_percentage 29922 1726853656.62659: checking to see if all hosts have failed and the running result is not ok 29922 1726853656.62660: done checking to see if all hosts have failed 29922 1726853656.62661: getting the remaining hosts for this loop 29922 1726853656.62662: done getting the remaining hosts for this loop 29922 1726853656.62666: getting the next task for host managed_node3 29922 1726853656.62679: done getting next task for host managed_node3 29922 1726853656.62681: ^ task is: TASK: Set current_interfaces 29922 1726853656.62685: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853656.62689: getting variables 29922 1726853656.62690: in VariableManager get_vars() 29922 1726853656.62725: Calling all_inventory to load vars for managed_node3 29922 1726853656.62727: Calling groups_inventory to load vars for managed_node3 29922 1726853656.62730: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853656.62741: Calling all_plugins_play to load vars for managed_node3 29922 1726853656.62743: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853656.62746: Calling groups_plugins_play to load vars for managed_node3 29922 1726853656.63041: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853656.63573: done with get_vars() 29922 1726853656.63581: done getting variables 29922 1726853656.63626: done sending task result for task 02083763-bbaf-51d4-513b-0000000001b0 29922 1726853656.63629: WORKER PROCESS EXITING 29922 1726853656.63642: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 13:34:16 -0400 (0:00:00.529) 0:00:05.566 ****** 29922 1726853656.63666: entering _queue_task() for managed_node3/set_fact 29922 1726853656.63874: worker is 1 (out of 1 available) 29922 1726853656.63887: exiting _queue_task() for managed_node3/set_fact 29922 1726853656.63900: done queuing things up, now waiting for results queue to drain 29922 1726853656.63901: waiting for pending results... 
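The 'Set current_interfaces' task at get_current_interfaces.yml:9 turns the registered command output into the current_interfaces fact shown in the result below. Given that the log resolves the _current_interfaces variable and that the resulting list matches the command's stdout lines exactly, the task is plausibly just the following (the exact expression is an assumption):

    - name: Set current_interfaces
      set_fact:
        current_interfaces: "{{ _current_interfaces.stdout_lines }}"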
29922 1726853656.64108: running TaskExecutor() for managed_node3/TASK: Set current_interfaces 29922 1726853656.64191: in run() - task 02083763-bbaf-51d4-513b-0000000001b1 29922 1726853656.64376: variable 'ansible_search_path' from source: unknown 29922 1726853656.64384: variable 'ansible_search_path' from source: unknown 29922 1726853656.64388: calling self._execute() 29922 1726853656.64391: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853656.64393: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853656.64395: variable 'omit' from source: magic vars 29922 1726853656.64723: variable 'ansible_distribution_major_version' from source: facts 29922 1726853656.64751: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853656.64754: variable 'omit' from source: magic vars 29922 1726853656.64813: variable 'omit' from source: magic vars 29922 1726853656.64976: variable '_current_interfaces' from source: set_fact 29922 1726853656.64980: variable 'omit' from source: magic vars 29922 1726853656.64994: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29922 1726853656.65068: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29922 1726853656.65095: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29922 1726853656.65118: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853656.65134: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853656.65169: variable 'inventory_hostname' from source: host vars for 'managed_node3' 29922 1726853656.65190: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853656.65199: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853656.65314: Set connection var ansible_connection to ssh 29922 1726853656.65336: Set connection var ansible_module_compression to ZIP_DEFLATED 29922 1726853656.65348: Set connection var ansible_shell_executable to /bin/sh 29922 1726853656.65359: Set connection var ansible_pipelining to False 29922 1726853656.65369: Set connection var ansible_timeout to 10 29922 1726853656.65378: Set connection var ansible_shell_type to sh 29922 1726853656.65403: variable 'ansible_shell_executable' from source: unknown 29922 1726853656.65411: variable 'ansible_connection' from source: unknown 29922 1726853656.65416: variable 'ansible_module_compression' from source: unknown 29922 1726853656.65422: variable 'ansible_shell_type' from source: unknown 29922 1726853656.65678: variable 'ansible_shell_executable' from source: unknown 29922 1726853656.65680: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853656.65683: variable 'ansible_pipelining' from source: unknown 29922 1726853656.65685: variable 'ansible_timeout' from source: unknown 29922 1726853656.65687: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853656.65690: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, 
class_only=False) 29922 1726853656.65693: variable 'omit' from source: magic vars 29922 1726853656.65695: starting attempt loop 29922 1726853656.65697: running the handler 29922 1726853656.65699: handler run complete 29922 1726853656.65701: attempt loop complete, returning result 29922 1726853656.65703: _execute() done 29922 1726853656.65705: dumping result to json 29922 1726853656.65707: done dumping result, returning 29922 1726853656.65709: done running TaskExecutor() for managed_node3/TASK: Set current_interfaces [02083763-bbaf-51d4-513b-0000000001b1] 29922 1726853656.65711: sending task result for task 02083763-bbaf-51d4-513b-0000000001b1 29922 1726853656.65775: done sending task result for task 02083763-bbaf-51d4-513b-0000000001b1 29922 1726853656.65779: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo", "rpltstbr" ] }, "changed": false } 29922 1726853656.65848: no more pending results, returning what we have 29922 1726853656.65852: results queue empty 29922 1726853656.65852: checking for any_errors_fatal 29922 1726853656.65869: done checking for any_errors_fatal 29922 1726853656.65870: checking for max_fail_percentage 29922 1726853656.65873: done checking for max_fail_percentage 29922 1726853656.65874: checking to see if all hosts have failed and the running result is not ok 29922 1726853656.65875: done checking to see if all hosts have failed 29922 1726853656.65876: getting the remaining hosts for this loop 29922 1726853656.65877: done getting the remaining hosts for this loop 29922 1726853656.65881: getting the next task for host managed_node3 29922 1726853656.65920: done getting next task for host managed_node3 29922 1726853656.65924: ^ task is: TASK: Show current_interfaces 29922 1726853656.65926: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853656.65931: getting variables 29922 1726853656.65932: in VariableManager get_vars() 29922 1726853656.65975: Calling all_inventory to load vars for managed_node3 29922 1726853656.65978: Calling groups_inventory to load vars for managed_node3 29922 1726853656.65980: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853656.66003: Calling all_plugins_play to load vars for managed_node3 29922 1726853656.66006: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853656.66010: Calling groups_plugins_play to load vars for managed_node3 29922 1726853656.66167: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853656.66295: done with get_vars() 29922 1726853656.66303: done getting variables 29922 1726853656.66372: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 13:34:16 -0400 (0:00:00.027) 0:00:05.593 ****** 29922 1726853656.66391: entering _queue_task() for managed_node3/debug 29922 1726853656.66393: Creating lock for debug 29922 1726853656.66590: worker is 1 (out of 1 available) 29922 1726853656.66605: exiting _queue_task() for managed_node3/debug 29922 1726853656.66615: done queuing things up, now waiting for results queue to drain 29922 1726853656.66616: waiting for pending results... 
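The ansible_facts result above turns that command output into the current_interfaces list. A minimal sketch of the matching set_fact task at get_current_interfaces.yml:9, assuming the registered _current_interfaces variable and a stdout_lines expression (the log only shows the resulting list, not the template used):

    - name: Set current_interfaces
      set_fact:
        current_interfaces: "{{ _current_interfaces.stdout_lines }}"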
29922 1726853656.66761: running TaskExecutor() for managed_node3/TASK: Show current_interfaces 29922 1726853656.66812: in run() - task 02083763-bbaf-51d4-513b-000000000122 29922 1726853656.66823: variable 'ansible_search_path' from source: unknown 29922 1726853656.66826: variable 'ansible_search_path' from source: unknown 29922 1726853656.66856: calling self._execute() 29922 1726853656.66918: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853656.66924: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853656.66933: variable 'omit' from source: magic vars 29922 1726853656.67194: variable 'ansible_distribution_major_version' from source: facts 29922 1726853656.67203: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853656.67209: variable 'omit' from source: magic vars 29922 1726853656.67235: variable 'omit' from source: magic vars 29922 1726853656.67303: variable 'current_interfaces' from source: set_fact 29922 1726853656.67323: variable 'omit' from source: magic vars 29922 1726853656.67353: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29922 1726853656.67382: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29922 1726853656.67397: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29922 1726853656.67410: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853656.67420: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853656.67443: variable 'inventory_hostname' from source: host vars for 'managed_node3' 29922 1726853656.67446: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853656.67449: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853656.67517: Set connection var ansible_connection to ssh 29922 1726853656.67524: Set connection var ansible_module_compression to ZIP_DEFLATED 29922 1726853656.67530: Set connection var ansible_shell_executable to /bin/sh 29922 1726853656.67537: Set connection var ansible_pipelining to False 29922 1726853656.67543: Set connection var ansible_timeout to 10 29922 1726853656.67545: Set connection var ansible_shell_type to sh 29922 1726853656.67561: variable 'ansible_shell_executable' from source: unknown 29922 1726853656.67569: variable 'ansible_connection' from source: unknown 29922 1726853656.67574: variable 'ansible_module_compression' from source: unknown 29922 1726853656.67577: variable 'ansible_shell_type' from source: unknown 29922 1726853656.67579: variable 'ansible_shell_executable' from source: unknown 29922 1726853656.67581: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853656.67583: variable 'ansible_pipelining' from source: unknown 29922 1726853656.67586: variable 'ansible_timeout' from source: unknown 29922 1726853656.67590: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853656.67776: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 
29922 1726853656.67780: variable 'omit' from source: magic vars 29922 1726853656.67782: starting attempt loop 29922 1726853656.67784: running the handler 29922 1726853656.67788: handler run complete 29922 1726853656.67805: attempt loop complete, returning result 29922 1726853656.67854: _execute() done 29922 1726853656.67867: dumping result to json 29922 1726853656.67879: done dumping result, returning 29922 1726853656.67891: done running TaskExecutor() for managed_node3/TASK: Show current_interfaces [02083763-bbaf-51d4-513b-000000000122] 29922 1726853656.67901: sending task result for task 02083763-bbaf-51d4-513b-000000000122 29922 1726853656.68113: done sending task result for task 02083763-bbaf-51d4-513b-000000000122 29922 1726853656.68117: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo', 'rpltstbr'] 29922 1726853656.68162: no more pending results, returning what we have 29922 1726853656.68166: results queue empty 29922 1726853656.68167: checking for any_errors_fatal 29922 1726853656.68170: done checking for any_errors_fatal 29922 1726853656.68172: checking for max_fail_percentage 29922 1726853656.68174: done checking for max_fail_percentage 29922 1726853656.68175: checking to see if all hosts have failed and the running result is not ok 29922 1726853656.68175: done checking to see if all hosts have failed 29922 1726853656.68176: getting the remaining hosts for this loop 29922 1726853656.68177: done getting the remaining hosts for this loop 29922 1726853656.68181: getting the next task for host managed_node3 29922 1726853656.68186: done getting next task for host managed_node3 29922 1726853656.68190: ^ task is: TASK: Include the task 'manage_test_interface.yml' 29922 1726853656.68192: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29922 1726853656.68196: getting variables 29922 1726853656.68197: in VariableManager get_vars() 29922 1726853656.68291: Calling all_inventory to load vars for managed_node3 29922 1726853656.68294: Calling groups_inventory to load vars for managed_node3 29922 1726853656.68296: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853656.68306: Calling all_plugins_play to load vars for managed_node3 29922 1726853656.68308: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853656.68312: Calling groups_plugins_play to load vars for managed_node3 29922 1726853656.68580: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853656.68710: done with get_vars() 29922 1726853656.68717: done getting variables TASK [Include the task 'manage_test_interface.yml'] **************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:16 Friday 20 September 2024 13:34:16 -0400 (0:00:00.023) 0:00:05.617 ****** 29922 1726853656.68780: entering _queue_task() for managed_node3/include_tasks 29922 1726853656.68951: worker is 1 (out of 1 available) 29922 1726853656.68968: exiting _queue_task() for managed_node3/include_tasks 29922 1726853656.68980: done queuing things up, now waiting for results queue to drain 29922 1726853656.68981: waiting for pending results... 
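The MSG line above is produced by the task at show_interfaces.yml:5, which is presumably a simple debug of the fact just set; a sketch with the message wording inferred from the logged output:

    - name: Show current_interfaces
      debug:
        msg: "current_interfaces: {{ current_interfaces }}"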
29922 1726853656.69130: running TaskExecutor() for managed_node3/TASK: Include the task 'manage_test_interface.yml' 29922 1726853656.69189: in run() - task 02083763-bbaf-51d4-513b-00000000000d 29922 1726853656.69198: variable 'ansible_search_path' from source: unknown 29922 1726853656.69230: calling self._execute() 29922 1726853656.69297: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853656.69301: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853656.69312: variable 'omit' from source: magic vars 29922 1726853656.69586: variable 'ansible_distribution_major_version' from source: facts 29922 1726853656.69595: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853656.69601: _execute() done 29922 1726853656.69603: dumping result to json 29922 1726853656.69606: done dumping result, returning 29922 1726853656.69612: done running TaskExecutor() for managed_node3/TASK: Include the task 'manage_test_interface.yml' [02083763-bbaf-51d4-513b-00000000000d] 29922 1726853656.69616: sending task result for task 02083763-bbaf-51d4-513b-00000000000d 29922 1726853656.69725: no more pending results, returning what we have 29922 1726853656.69729: in VariableManager get_vars() 29922 1726853656.69769: Calling all_inventory to load vars for managed_node3 29922 1726853656.69773: Calling groups_inventory to load vars for managed_node3 29922 1726853656.69776: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853656.69787: Calling all_plugins_play to load vars for managed_node3 29922 1726853656.69789: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853656.69792: Calling groups_plugins_play to load vars for managed_node3 29922 1726853656.69933: done sending task result for task 02083763-bbaf-51d4-513b-00000000000d 29922 1726853656.69936: WORKER PROCESS EXITING 29922 1726853656.69946: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853656.70076: done with get_vars() 29922 1726853656.70083: variable 'ansible_search_path' from source: unknown 29922 1726853656.70092: we have included files to process 29922 1726853656.70092: generating all_blocks data 29922 1726853656.70094: done generating all_blocks data 29922 1726853656.70096: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 29922 1726853656.70097: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 29922 1726853656.70099: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 29922 1726853656.70441: in VariableManager get_vars() 29922 1726853656.70453: done with get_vars() 29922 1726853656.70599: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 29922 1726853656.70963: done processing included file 29922 1726853656.70964: iterating over new_blocks loaded from include file 29922 1726853656.70965: in VariableManager get_vars() 29922 1726853656.70976: done with get_vars() 29922 1726853656.70977: filtering new block on tags 29922 1726853656.70995: done filtering new block on tags 29922 1726853656.70997: done iterating over new_blocks loaded from include file included: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml for managed_node3 29922 1726853656.71000: extending task lists for all hosts with included blocks 29922 1726853656.71706: done extending task lists 29922 1726853656.71707: done processing included files 29922 1726853656.71708: results queue empty 29922 1726853656.71708: checking for any_errors_fatal 29922 1726853656.71710: done checking for any_errors_fatal 29922 1726853656.71710: checking for max_fail_percentage 29922 1726853656.71711: done checking for max_fail_percentage 29922 1726853656.71712: checking to see if all hosts have failed and the running result is not ok 29922 1726853656.71712: done checking to see if all hosts have failed 29922 1726853656.71712: getting the remaining hosts for this loop 29922 1726853656.71713: done getting the remaining hosts for this loop 29922 1726853656.71715: getting the next task for host managed_node3 29922 1726853656.71718: done getting next task for host managed_node3 29922 1726853656.71719: ^ task is: TASK: Ensure state in ["present", "absent"] 29922 1726853656.71721: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29922 1726853656.71722: getting variables 29922 1726853656.71722: in VariableManager get_vars() 29922 1726853656.71730: Calling all_inventory to load vars for managed_node3 29922 1726853656.71731: Calling groups_inventory to load vars for managed_node3 29922 1726853656.71732: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853656.71736: Calling all_plugins_play to load vars for managed_node3 29922 1726853656.71737: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853656.71739: Calling groups_plugins_play to load vars for managed_node3 29922 1726853656.71844: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853656.71963: done with get_vars() 29922 1726853656.71969: done getting variables 29922 1726853656.72013: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Ensure state in ["present", "absent"]] *********************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:3 Friday 20 September 2024 13:34:16 -0400 (0:00:00.032) 0:00:05.649 ****** 29922 1726853656.72033: entering _queue_task() for managed_node3/fail 29922 1726853656.72034: Creating lock for fail 29922 1726853656.72252: worker is 1 (out of 1 available) 29922 1726853656.72269: exiting _queue_task() for managed_node3/fail 29922 1726853656.72282: done queuing things up, now waiting for results queue to drain 29922 1726853656.72284: waiting for pending results... 
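The include processed above pulls manage_test_interface.yml into the task list at run time. The including task at tests_routing_rules.yml:16 is presumably just a dynamic include along these lines (the relative path is an assumption):

    - name: Include the task 'manage_test_interface.yml'
      include_tasks: tasks/manage_test_interface.yml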
29922 1726853656.72432: running TaskExecutor() for managed_node3/TASK: Ensure state in ["present", "absent"] 29922 1726853656.72486: in run() - task 02083763-bbaf-51d4-513b-0000000001cc 29922 1726853656.72498: variable 'ansible_search_path' from source: unknown 29922 1726853656.72502: variable 'ansible_search_path' from source: unknown 29922 1726853656.72531: calling self._execute() 29922 1726853656.72597: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853656.72600: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853656.72610: variable 'omit' from source: magic vars 29922 1726853656.72882: variable 'ansible_distribution_major_version' from source: facts 29922 1726853656.72891: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853656.72979: variable 'state' from source: include params 29922 1726853656.72982: Evaluated conditional (state not in ["present", "absent"]): False 29922 1726853656.72985: when evaluation is False, skipping this task 29922 1726853656.72988: _execute() done 29922 1726853656.72990: dumping result to json 29922 1726853656.72994: done dumping result, returning 29922 1726853656.73000: done running TaskExecutor() for managed_node3/TASK: Ensure state in ["present", "absent"] [02083763-bbaf-51d4-513b-0000000001cc] 29922 1726853656.73005: sending task result for task 02083763-bbaf-51d4-513b-0000000001cc 29922 1726853656.73085: done sending task result for task 02083763-bbaf-51d4-513b-0000000001cc 29922 1726853656.73088: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "state not in [\"present\", \"absent\"]", "skip_reason": "Conditional result was False" } 29922 1726853656.73132: no more pending results, returning what we have 29922 1726853656.73135: results queue empty 29922 1726853656.73136: checking for any_errors_fatal 29922 1726853656.73138: done checking for any_errors_fatal 29922 1726853656.73139: checking for max_fail_percentage 29922 1726853656.73140: done checking for max_fail_percentage 29922 1726853656.73141: checking to see if all hosts have failed and the running result is not ok 29922 1726853656.73142: done checking to see if all hosts have failed 29922 1726853656.73142: getting the remaining hosts for this loop 29922 1726853656.73144: done getting the remaining hosts for this loop 29922 1726853656.73147: getting the next task for host managed_node3 29922 1726853656.73152: done getting next task for host managed_node3 29922 1726853656.73154: ^ task is: TASK: Ensure type in ["dummy", "tap", "veth"] 29922 1726853656.73156: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853656.73162: getting variables 29922 1726853656.73164: in VariableManager get_vars() 29922 1726853656.73194: Calling all_inventory to load vars for managed_node3 29922 1726853656.73196: Calling groups_inventory to load vars for managed_node3 29922 1726853656.73198: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853656.73207: Calling all_plugins_play to load vars for managed_node3 29922 1726853656.73209: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853656.73212: Calling groups_plugins_play to load vars for managed_node3 29922 1726853656.73338: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853656.73483: done with get_vars() 29922 1726853656.73490: done getting variables 29922 1726853656.73527: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Ensure type in ["dummy", "tap", "veth"]] ********************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:8 Friday 20 September 2024 13:34:16 -0400 (0:00:00.015) 0:00:05.665 ****** 29922 1726853656.73545: entering _queue_task() for managed_node3/fail 29922 1726853656.73732: worker is 1 (out of 1 available) 29922 1726853656.73745: exiting _queue_task() for managed_node3/fail 29922 1726853656.73756: done queuing things up, now waiting for results queue to drain 29922 1726853656.73760: waiting for pending results... 29922 1726853656.73899: running TaskExecutor() for managed_node3/TASK: Ensure type in ["dummy", "tap", "veth"] 29922 1726853656.73957: in run() - task 02083763-bbaf-51d4-513b-0000000001cd 29922 1726853656.73969: variable 'ansible_search_path' from source: unknown 29922 1726853656.73976: variable 'ansible_search_path' from source: unknown 29922 1726853656.74003: calling self._execute() 29922 1726853656.74060: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853656.74064: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853656.74075: variable 'omit' from source: magic vars 29922 1726853656.74329: variable 'ansible_distribution_major_version' from source: facts 29922 1726853656.74338: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853656.74429: variable 'type' from source: set_fact 29922 1726853656.74433: Evaluated conditional (type not in ["dummy", "tap", "veth"]): False 29922 1726853656.74437: when evaluation is False, skipping this task 29922 1726853656.74440: _execute() done 29922 1726853656.74442: dumping result to json 29922 1726853656.74445: done dumping result, returning 29922 1726853656.74450: done running TaskExecutor() for managed_node3/TASK: Ensure type in ["dummy", "tap", "veth"] [02083763-bbaf-51d4-513b-0000000001cd] 29922 1726853656.74455: sending task result for task 02083763-bbaf-51d4-513b-0000000001cd 29922 1726853656.74532: done sending task result for task 02083763-bbaf-51d4-513b-0000000001cd 29922 1726853656.74534: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]", "skip_reason": "Conditional result was False" } 29922 1726853656.74581: no more pending 
results, returning what we have 29922 1726853656.74584: results queue empty 29922 1726853656.74584: checking for any_errors_fatal 29922 1726853656.74589: done checking for any_errors_fatal 29922 1726853656.74590: checking for max_fail_percentage 29922 1726853656.74591: done checking for max_fail_percentage 29922 1726853656.74592: checking to see if all hosts have failed and the running result is not ok 29922 1726853656.74592: done checking to see if all hosts have failed 29922 1726853656.74593: getting the remaining hosts for this loop 29922 1726853656.74594: done getting the remaining hosts for this loop 29922 1726853656.74597: getting the next task for host managed_node3 29922 1726853656.74602: done getting next task for host managed_node3 29922 1726853656.74604: ^ task is: TASK: Include the task 'show_interfaces.yml' 29922 1726853656.74607: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29922 1726853656.74610: getting variables 29922 1726853656.74611: in VariableManager get_vars() 29922 1726853656.74638: Calling all_inventory to load vars for managed_node3 29922 1726853656.74641: Calling groups_inventory to load vars for managed_node3 29922 1726853656.74642: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853656.74651: Calling all_plugins_play to load vars for managed_node3 29922 1726853656.74654: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853656.74656: Calling groups_plugins_play to load vars for managed_node3 29922 1726853656.74773: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853656.74899: done with get_vars() 29922 1726853656.74906: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:13 Friday 20 September 2024 13:34:16 -0400 (0:00:00.014) 0:00:05.679 ****** 29922 1726853656.74965: entering _queue_task() for managed_node3/include_tasks 29922 1726853656.75144: worker is 1 (out of 1 available) 29922 1726853656.75161: exiting _queue_task() for managed_node3/include_tasks 29922 1726853656.75175: done queuing things up, now waiting for results queue to drain 29922 1726853656.75177: waiting for pending results... 
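The two skip results above come from input guards at manage_test_interface.yml:3 and :8. From the false_condition strings in the skip output, they are presumably fail tasks of roughly this shape (the msg texts below are placeholders, not the actual file contents):

    - name: Ensure state in ["present", "absent"]
      fail:
        msg: "state must be present or absent"  # placeholder message
      when: state not in ["present", "absent"]

    - name: Ensure type in ["dummy", "tap", "veth"]
      fail:
        msg: "type must be dummy, tap or veth"  # placeholder message
      when: type not in ["dummy", "tap", "veth"]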
29922 1726853656.75309: running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' 29922 1726853656.75360: in run() - task 02083763-bbaf-51d4-513b-0000000001ce 29922 1726853656.75368: variable 'ansible_search_path' from source: unknown 29922 1726853656.75375: variable 'ansible_search_path' from source: unknown 29922 1726853656.75405: calling self._execute() 29922 1726853656.75460: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853656.75464: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853656.75472: variable 'omit' from source: magic vars 29922 1726853656.75716: variable 'ansible_distribution_major_version' from source: facts 29922 1726853656.75732: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853656.75736: _execute() done 29922 1726853656.75739: dumping result to json 29922 1726853656.75742: done dumping result, returning 29922 1726853656.75744: done running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' [02083763-bbaf-51d4-513b-0000000001ce] 29922 1726853656.75747: sending task result for task 02083763-bbaf-51d4-513b-0000000001ce 29922 1726853656.75824: done sending task result for task 02083763-bbaf-51d4-513b-0000000001ce 29922 1726853656.75826: WORKER PROCESS EXITING 29922 1726853656.75881: no more pending results, returning what we have 29922 1726853656.75884: in VariableManager get_vars() 29922 1726853656.75915: Calling all_inventory to load vars for managed_node3 29922 1726853656.75917: Calling groups_inventory to load vars for managed_node3 29922 1726853656.75919: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853656.75927: Calling all_plugins_play to load vars for managed_node3 29922 1726853656.75929: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853656.75932: Calling groups_plugins_play to load vars for managed_node3 29922 1726853656.76080: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853656.76199: done with get_vars() 29922 1726853656.76204: variable 'ansible_search_path' from source: unknown 29922 1726853656.76204: variable 'ansible_search_path' from source: unknown 29922 1726853656.76232: we have included files to process 29922 1726853656.76233: generating all_blocks data 29922 1726853656.76235: done generating all_blocks data 29922 1726853656.76240: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 29922 1726853656.76241: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 29922 1726853656.76243: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 29922 1726853656.76336: in VariableManager get_vars() 29922 1726853656.76356: done with get_vars() 29922 1726853656.76461: done processing included file 29922 1726853656.76463: iterating over new_blocks loaded from include file 29922 1726853656.76465: in VariableManager get_vars() 29922 1726853656.76514: done with get_vars() 29922 1726853656.76516: filtering new block on tags 29922 1726853656.76533: done filtering new block on tags 29922 1726853656.76536: done iterating over new_blocks loaded from include file included: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node3 29922 1726853656.76541: extending task lists for all hosts with included blocks 29922 1726853656.76951: done extending task lists 29922 1726853656.76953: done processing included files 29922 1726853656.76954: results queue empty 29922 1726853656.76954: checking for any_errors_fatal 29922 1726853656.76957: done checking for any_errors_fatal 29922 1726853656.76958: checking for max_fail_percentage 29922 1726853656.76959: done checking for max_fail_percentage 29922 1726853656.76959: checking to see if all hosts have failed and the running result is not ok 29922 1726853656.76960: done checking to see if all hosts have failed 29922 1726853656.76961: getting the remaining hosts for this loop 29922 1726853656.76962: done getting the remaining hosts for this loop 29922 1726853656.76964: getting the next task for host managed_node3 29922 1726853656.76968: done getting next task for host managed_node3 29922 1726853656.76970: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 29922 1726853656.76975: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29922 1726853656.76978: getting variables 29922 1726853656.76979: in VariableManager get_vars() 29922 1726853656.76989: Calling all_inventory to load vars for managed_node3 29922 1726853656.76991: Calling groups_inventory to load vars for managed_node3 29922 1726853656.76993: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853656.77007: Calling all_plugins_play to load vars for managed_node3 29922 1726853656.77010: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853656.77013: Calling groups_plugins_play to load vars for managed_node3 29922 1726853656.77177: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853656.77421: done with get_vars() 29922 1726853656.77430: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 13:34:16 -0400 (0:00:00.025) 0:00:05.704 ****** 29922 1726853656.77515: entering _queue_task() for managed_node3/include_tasks 29922 1726853656.77988: worker is 1 (out of 1 available) 29922 1726853656.77998: exiting _queue_task() for managed_node3/include_tasks 29922 1726853656.78009: done queuing things up, now waiting for results queue to drain 29922 1726853656.78010: waiting for pending results... 
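show_interfaces.yml itself starts by including get_current_interfaces.yml (task path show_interfaces.yml:3 above), which is why the 'Gather current interface info' and 'Set current_interfaces' pair runs again below; a sketch of that nested include, assuming the file sits next to show_interfaces.yml:

    - name: Include the task 'get_current_interfaces.yml'
      include_tasks: get_current_interfaces.yml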
29922 1726853656.78143: running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' 29922 1726853656.78191: in run() - task 02083763-bbaf-51d4-513b-000000000275 29922 1726853656.78212: variable 'ansible_search_path' from source: unknown 29922 1726853656.78222: variable 'ansible_search_path' from source: unknown 29922 1726853656.78276: calling self._execute() 29922 1726853656.78375: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853656.78391: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853656.78407: variable 'omit' from source: magic vars 29922 1726853656.78824: variable 'ansible_distribution_major_version' from source: facts 29922 1726853656.78876: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853656.78879: _execute() done 29922 1726853656.78893: dumping result to json 29922 1726853656.78900: done dumping result, returning 29922 1726853656.78903: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' [02083763-bbaf-51d4-513b-000000000275] 29922 1726853656.78905: sending task result for task 02083763-bbaf-51d4-513b-000000000275 29922 1726853656.79149: no more pending results, returning what we have 29922 1726853656.79156: in VariableManager get_vars() 29922 1726853656.79205: Calling all_inventory to load vars for managed_node3 29922 1726853656.79208: Calling groups_inventory to load vars for managed_node3 29922 1726853656.79211: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853656.79232: Calling all_plugins_play to load vars for managed_node3 29922 1726853656.79236: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853656.79240: Calling groups_plugins_play to load vars for managed_node3 29922 1726853656.79613: done sending task result for task 02083763-bbaf-51d4-513b-000000000275 29922 1726853656.79617: WORKER PROCESS EXITING 29922 1726853656.79642: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853656.79882: done with get_vars() 29922 1726853656.79891: variable 'ansible_search_path' from source: unknown 29922 1726853656.79892: variable 'ansible_search_path' from source: unknown 29922 1726853656.79958: we have included files to process 29922 1726853656.79959: generating all_blocks data 29922 1726853656.79961: done generating all_blocks data 29922 1726853656.79962: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 29922 1726853656.79963: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 29922 1726853656.79966: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 29922 1726853656.80268: done processing included file 29922 1726853656.80273: iterating over new_blocks loaded from include file 29922 1726853656.80275: in VariableManager get_vars() 29922 1726853656.80292: done with get_vars() 29922 1726853656.80294: filtering new block on tags 29922 1726853656.80318: done filtering new block on tags 29922 1726853656.80321: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for 
managed_node3 29922 1726853656.80326: extending task lists for all hosts with included blocks 29922 1726853656.80502: done extending task lists 29922 1726853656.80503: done processing included files 29922 1726853656.80504: results queue empty 29922 1726853656.80504: checking for any_errors_fatal 29922 1726853656.80508: done checking for any_errors_fatal 29922 1726853656.80509: checking for max_fail_percentage 29922 1726853656.80510: done checking for max_fail_percentage 29922 1726853656.80510: checking to see if all hosts have failed and the running result is not ok 29922 1726853656.80511: done checking to see if all hosts have failed 29922 1726853656.80512: getting the remaining hosts for this loop 29922 1726853656.80513: done getting the remaining hosts for this loop 29922 1726853656.80515: getting the next task for host managed_node3 29922 1726853656.80519: done getting next task for host managed_node3 29922 1726853656.80521: ^ task is: TASK: Gather current interface info 29922 1726853656.80529: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853656.80532: getting variables 29922 1726853656.80533: in VariableManager get_vars() 29922 1726853656.80543: Calling all_inventory to load vars for managed_node3 29922 1726853656.80545: Calling groups_inventory to load vars for managed_node3 29922 1726853656.80547: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853656.80552: Calling all_plugins_play to load vars for managed_node3 29922 1726853656.80555: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853656.80557: Calling groups_plugins_play to load vars for managed_node3 29922 1726853656.80724: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853656.80846: done with get_vars() 29922 1726853656.80852: done getting variables 29922 1726853656.80883: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 13:34:16 -0400 (0:00:00.033) 0:00:05.738 ****** 29922 1726853656.80905: entering _queue_task() for managed_node3/command 29922 1726853656.81118: worker is 1 (out of 1 available) 29922 1726853656.81131: exiting _queue_task() for managed_node3/command 29922 1726853656.81144: done queuing things up, now waiting for results queue to drain 29922 1726853656.81145: waiting for pending results... 
29922 1726853656.81301: running TaskExecutor() for managed_node3/TASK: Gather current interface info 29922 1726853656.81369: in run() - task 02083763-bbaf-51d4-513b-0000000002ac 29922 1726853656.81382: variable 'ansible_search_path' from source: unknown 29922 1726853656.81385: variable 'ansible_search_path' from source: unknown 29922 1726853656.81413: calling self._execute() 29922 1726853656.81476: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853656.81483: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853656.81494: variable 'omit' from source: magic vars 29922 1726853656.81756: variable 'ansible_distribution_major_version' from source: facts 29922 1726853656.81768: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853656.81776: variable 'omit' from source: magic vars 29922 1726853656.81808: variable 'omit' from source: magic vars 29922 1726853656.81836: variable 'omit' from source: magic vars 29922 1726853656.81868: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29922 1726853656.81897: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29922 1726853656.81913: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29922 1726853656.81930: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853656.81937: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853656.81962: variable 'inventory_hostname' from source: host vars for 'managed_node3' 29922 1726853656.81966: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853656.81969: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853656.82039: Set connection var ansible_connection to ssh 29922 1726853656.82045: Set connection var ansible_module_compression to ZIP_DEFLATED 29922 1726853656.82052: Set connection var ansible_shell_executable to /bin/sh 29922 1726853656.82059: Set connection var ansible_pipelining to False 29922 1726853656.82066: Set connection var ansible_timeout to 10 29922 1726853656.82069: Set connection var ansible_shell_type to sh 29922 1726853656.82088: variable 'ansible_shell_executable' from source: unknown 29922 1726853656.82090: variable 'ansible_connection' from source: unknown 29922 1726853656.82093: variable 'ansible_module_compression' from source: unknown 29922 1726853656.82096: variable 'ansible_shell_type' from source: unknown 29922 1726853656.82099: variable 'ansible_shell_executable' from source: unknown 29922 1726853656.82101: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853656.82105: variable 'ansible_pipelining' from source: unknown 29922 1726853656.82107: variable 'ansible_timeout' from source: unknown 29922 1726853656.82111: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853656.82212: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 29922 1726853656.82221: variable 'omit' from source: magic vars 29922 
1726853656.82226: starting attempt loop 29922 1726853656.82229: running the handler 29922 1726853656.82243: _low_level_execute_command(): starting 29922 1726853656.82250: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 29922 1726853656.82760: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853656.82764: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853656.82767: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853656.82769: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853656.82813: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853656.82816: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853656.82904: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853656.84603: stdout chunk (state=3): >>>/root <<< 29922 1726853656.84705: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853656.84734: stderr chunk (state=3): >>><<< 29922 1726853656.84739: stdout chunk (state=3): >>><<< 29922 1726853656.84759: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853656.84774: _low_level_execute_command(): starting 29922 1726853656.84782: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1726853656.847612-30227-207230349935566 `" && echo ansible-tmp-1726853656.847612-30227-207230349935566="` echo /root/.ansible/tmp/ansible-tmp-1726853656.847612-30227-207230349935566 `" ) && sleep 0' 29922 1726853656.85216: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853656.85219: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 29922 1726853656.85221: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 29922 1726853656.85231: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853656.85233: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853656.85281: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853656.85289: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853656.85292: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853656.85347: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853656.87266: stdout chunk (state=3): >>>ansible-tmp-1726853656.847612-30227-207230349935566=/root/.ansible/tmp/ansible-tmp-1726853656.847612-30227-207230349935566 <<< 29922 1726853656.87374: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853656.87399: stderr chunk (state=3): >>><<< 29922 1726853656.87402: stdout chunk (state=3): >>><<< 29922 1726853656.87415: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853656.847612-30227-207230349935566=/root/.ansible/tmp/ansible-tmp-1726853656.847612-30227-207230349935566 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: 
master session id: 2 debug2: Received exit status from master 0 29922 1726853656.87442: variable 'ansible_module_compression' from source: unknown 29922 1726853656.87485: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-29922avbm1zsd/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 29922 1726853656.87514: variable 'ansible_facts' from source: unknown 29922 1726853656.87574: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853656.847612-30227-207230349935566/AnsiballZ_command.py 29922 1726853656.87673: Sending initial data 29922 1726853656.87676: Sent initial data (155 bytes) 29922 1726853656.88180: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853656.88194: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853656.88210: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853656.88375: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853656.89889: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 29922 1726853656.89905: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 29922 1726853656.89916: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 29922 1726853656.89925: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 29922 1726853656.89943: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 29922 1726853656.90023: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 29922 1726853656.90109: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-29922avbm1zsd/tmpe7muthf0 /root/.ansible/tmp/ansible-tmp-1726853656.847612-30227-207230349935566/AnsiballZ_command.py <<< 29922 1726853656.90122: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853656.847612-30227-207230349935566/AnsiballZ_command.py" <<< 29922 1726853656.90168: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-29922avbm1zsd/tmpe7muthf0" to remote "/root/.ansible/tmp/ansible-tmp-1726853656.847612-30227-207230349935566/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853656.847612-30227-207230349935566/AnsiballZ_command.py" <<< 29922 1726853656.90986: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853656.91197: stderr chunk (state=3): >>><<< 29922 1726853656.91201: stdout chunk (state=3): >>><<< 29922 1726853656.91203: done transferring module to remote 29922 1726853656.91205: _low_level_execute_command(): starting 29922 1726853656.91207: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853656.847612-30227-207230349935566/ /root/.ansible/tmp/ansible-tmp-1726853656.847612-30227-207230349935566/AnsiballZ_command.py && sleep 0' 29922 1726853656.91731: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853656.91746: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853656.91768: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853656.91789: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853656.91882: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853656.91901: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853656.91921: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853656.92007: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853656.93941: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853656.93962: stdout chunk (state=3): >>><<< 29922 1726853656.93979: stderr chunk (state=3): >>><<< 29922 1726853656.94002: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 
originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853656.94012: _low_level_execute_command(): starting 29922 1726853656.94105: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853656.847612-30227-207230349935566/AnsiballZ_command.py && sleep 0' 29922 1726853656.94670: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853656.94688: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853656.94723: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853656.94804: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853656.94827: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853656.94922: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853658.11194: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo\nrpltstbr", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 13:34:17.104124", "end": "2024-09-20 13:34:18.108608", "delta": "0:00:01.004484", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 29922 1726853658.12650: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853658.12687: stderr chunk (state=3): >>>Shared connection to 10.31.11.217 closed. 
<<< 29922 1726853658.12741: stderr chunk (state=3): >>><<< 29922 1726853658.13128: stdout chunk (state=3): >>><<< 29922 1726853658.13132: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo\nrpltstbr", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 13:34:17.104124", "end": "2024-09-20 13:34:18.108608", "delta": "0:00:01.004484", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
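The JSON payload recorded above is the return value of the ansible.legacy.command module: the module_args show chdir=/sys/class/net with _raw_params "ls -1", and stdout lists the interfaces bonding_masters, eth0, lo and rpltstbr. For reference, a minimal task sketch that would produce this invocation, inferred only from the logged module_args (the register name and the changed_when handling are assumptions; the actual task in get_current_interfaces.yml may be written differently):

- name: Gather current interface info
  ansible.builtin.command:
    cmd: ls -1
    chdir: /sys/class/net
  register: _current_interfaces   # assumed name, suggested by the later "variable '_current_interfaces'" entries
  changed_when: false             # assumed; the final task result reports changed: false although the module returned changed: true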
29922 1726853658.13138: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853656.847612-30227-207230349935566/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 29922 1726853658.13142: _low_level_execute_command(): starting 29922 1726853658.13145: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853656.847612-30227-207230349935566/ > /dev/null 2>&1 && sleep 0' 29922 1726853658.14228: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853658.14364: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853658.14416: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853658.14575: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853658.14579: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853658.14581: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853658.16553: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853658.16560: stdout chunk (state=3): >>><<< 29922 1726853658.16563: stderr chunk (state=3): >>><<< 29922 1726853658.16777: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853658.16781: handler run complete 29922 1726853658.16784: Evaluated conditional (False): False 29922 1726853658.16786: attempt loop complete, returning result 29922 1726853658.16788: _execute() done 29922 1726853658.16789: dumping result to json 29922 1726853658.16791: done dumping result, returning 29922 1726853658.16793: done running TaskExecutor() for managed_node3/TASK: Gather current interface info [02083763-bbaf-51d4-513b-0000000002ac] 29922 1726853658.16795: sending task result for task 02083763-bbaf-51d4-513b-0000000002ac 29922 1726853658.16875: done sending task result for task 02083763-bbaf-51d4-513b-0000000002ac ok: [managed_node3] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:01.004484", "end": "2024-09-20 13:34:18.108608", "rc": 0, "start": "2024-09-20 13:34:17.104124" } STDOUT: bonding_masters eth0 lo rpltstbr 29922 1726853658.16947: no more pending results, returning what we have 29922 1726853658.16951: results queue empty 29922 1726853658.16951: checking for any_errors_fatal 29922 1726853658.16953: done checking for any_errors_fatal 29922 1726853658.16953: checking for max_fail_percentage 29922 1726853658.16955: done checking for max_fail_percentage 29922 1726853658.16955: checking to see if all hosts have failed and the running result is not ok 29922 1726853658.16956: done checking to see if all hosts have failed 29922 1726853658.16957: getting the remaining hosts for this loop 29922 1726853658.16959: done getting the remaining hosts for this loop 29922 1726853658.16962: getting the next task for host managed_node3 29922 1726853658.16968: done getting next task for host managed_node3 29922 1726853658.16970: ^ task is: TASK: Set current_interfaces 29922 1726853658.16978: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853658.16984: getting variables 29922 1726853658.16985: in VariableManager get_vars() 29922 1726853658.17023: Calling all_inventory to load vars for managed_node3 29922 1726853658.17026: Calling groups_inventory to load vars for managed_node3 29922 1726853658.17029: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853658.17043: Calling all_plugins_play to load vars for managed_node3 29922 1726853658.17633: WORKER PROCESS EXITING 29922 1726853658.17639: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853658.17645: Calling groups_plugins_play to load vars for managed_node3 29922 1726853658.17830: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853658.19291: done with get_vars() 29922 1726853658.19305: done getting variables 29922 1726853658.19366: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 13:34:18 -0400 (0:00:01.384) 0:00:07.123 ****** 29922 1726853658.19402: entering _queue_task() for managed_node3/set_fact 29922 1726853658.20327: worker is 1 (out of 1 available) 29922 1726853658.20342: exiting _queue_task() for managed_node3/set_fact 29922 1726853658.20354: done queuing things up, now waiting for results queue to drain 29922 1726853658.20355: waiting for pending results... 
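The Set current_interfaces task queued above (task path get_current_interfaces.yml:9) turns the registered command output into the fact current_interfaces, which the result below reports as ['bonding_masters', 'eth0', 'lo', 'rpltstbr']. One plausible sketch of such a task, assuming the registered command result is exposed as _current_interfaces (the actual Jinja2 expression in the collection may differ):

- name: Set current_interfaces
  ansible.builtin.set_fact:
    current_interfaces: "{{ _current_interfaces.stdout_lines }}"  # stdout_lines splits the ls -1 output into a list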
29922 1726853658.21091: running TaskExecutor() for managed_node3/TASK: Set current_interfaces 29922 1726853658.21104: in run() - task 02083763-bbaf-51d4-513b-0000000002ad 29922 1726853658.21107: variable 'ansible_search_path' from source: unknown 29922 1726853658.21110: variable 'ansible_search_path' from source: unknown 29922 1726853658.21477: calling self._execute() 29922 1726853658.21481: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853658.21484: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853658.21487: variable 'omit' from source: magic vars 29922 1726853658.22155: variable 'ansible_distribution_major_version' from source: facts 29922 1726853658.22178: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853658.22190: variable 'omit' from source: magic vars 29922 1726853658.22247: variable 'omit' from source: magic vars 29922 1726853658.22481: variable '_current_interfaces' from source: set_fact 29922 1726853658.22546: variable 'omit' from source: magic vars 29922 1726853658.22715: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29922 1726853658.22755: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29922 1726853658.23076: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29922 1726853658.23079: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853658.23275: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853658.23279: variable 'inventory_hostname' from source: host vars for 'managed_node3' 29922 1726853658.23281: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853658.23283: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853658.23285: Set connection var ansible_connection to ssh 29922 1726853658.23287: Set connection var ansible_module_compression to ZIP_DEFLATED 29922 1726853658.23289: Set connection var ansible_shell_executable to /bin/sh 29922 1726853658.23291: Set connection var ansible_pipelining to False 29922 1726853658.23293: Set connection var ansible_timeout to 10 29922 1726853658.23295: Set connection var ansible_shell_type to sh 29922 1726853658.23296: variable 'ansible_shell_executable' from source: unknown 29922 1726853658.23299: variable 'ansible_connection' from source: unknown 29922 1726853658.23302: variable 'ansible_module_compression' from source: unknown 29922 1726853658.23304: variable 'ansible_shell_type' from source: unknown 29922 1726853658.23307: variable 'ansible_shell_executable' from source: unknown 29922 1726853658.23309: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853658.23312: variable 'ansible_pipelining' from source: unknown 29922 1726853658.23315: variable 'ansible_timeout' from source: unknown 29922 1726853658.23317: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853658.23575: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, 
class_only=False) 29922 1726853658.23876: variable 'omit' from source: magic vars 29922 1726853658.23879: starting attempt loop 29922 1726853658.23882: running the handler 29922 1726853658.23885: handler run complete 29922 1726853658.23888: attempt loop complete, returning result 29922 1726853658.23892: _execute() done 29922 1726853658.23894: dumping result to json 29922 1726853658.23897: done dumping result, returning 29922 1726853658.23904: done running TaskExecutor() for managed_node3/TASK: Set current_interfaces [02083763-bbaf-51d4-513b-0000000002ad] 29922 1726853658.23906: sending task result for task 02083763-bbaf-51d4-513b-0000000002ad 29922 1726853658.23974: done sending task result for task 02083763-bbaf-51d4-513b-0000000002ad 29922 1726853658.23977: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo", "rpltstbr" ] }, "changed": false } 29922 1726853658.24063: no more pending results, returning what we have 29922 1726853658.24066: results queue empty 29922 1726853658.24067: checking for any_errors_fatal 29922 1726853658.24080: done checking for any_errors_fatal 29922 1726853658.24080: checking for max_fail_percentage 29922 1726853658.24082: done checking for max_fail_percentage 29922 1726853658.24083: checking to see if all hosts have failed and the running result is not ok 29922 1726853658.24083: done checking to see if all hosts have failed 29922 1726853658.24084: getting the remaining hosts for this loop 29922 1726853658.24085: done getting the remaining hosts for this loop 29922 1726853658.24089: getting the next task for host managed_node3 29922 1726853658.24096: done getting next task for host managed_node3 29922 1726853658.24100: ^ task is: TASK: Show current_interfaces 29922 1726853658.24105: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853658.24109: getting variables 29922 1726853658.24110: in VariableManager get_vars() 29922 1726853658.24146: Calling all_inventory to load vars for managed_node3 29922 1726853658.24148: Calling groups_inventory to load vars for managed_node3 29922 1726853658.24151: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853658.24161: Calling all_plugins_play to load vars for managed_node3 29922 1726853658.24164: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853658.24166: Calling groups_plugins_play to load vars for managed_node3 29922 1726853658.24355: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853658.25388: done with get_vars() 29922 1726853658.25400: done getting variables 29922 1726853658.25458: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 13:34:18 -0400 (0:00:00.060) 0:00:07.184 ****** 29922 1726853658.25491: entering _queue_task() for managed_node3/debug 29922 1726853658.26612: worker is 1 (out of 1 available) 29922 1726853658.26621: exiting _queue_task() for managed_node3/debug 29922 1726853658.26635: done queuing things up, now waiting for results queue to drain 29922 1726853658.26636: waiting for pending results... 29922 1726853658.26674: running TaskExecutor() for managed_node3/TASK: Show current_interfaces 29922 1726853658.27277: in run() - task 02083763-bbaf-51d4-513b-000000000276 29922 1726853658.27282: variable 'ansible_search_path' from source: unknown 29922 1726853658.27285: variable 'ansible_search_path' from source: unknown 29922 1726853658.27288: calling self._execute() 29922 1726853658.27290: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853658.27294: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853658.27298: variable 'omit' from source: magic vars 29922 1726853658.28002: variable 'ansible_distribution_major_version' from source: facts 29922 1726853658.28276: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853658.28280: variable 'omit' from source: magic vars 29922 1726853658.28282: variable 'omit' from source: magic vars 29922 1726853658.28348: variable 'current_interfaces' from source: set_fact 29922 1726853658.28676: variable 'omit' from source: magic vars 29922 1726853658.28680: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29922 1726853658.28694: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29922 1726853658.28717: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29922 1726853658.28737: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853658.28752: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853658.28792: 
variable 'inventory_hostname' from source: host vars for 'managed_node3' 29922 1726853658.28801: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853658.28809: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853658.29113: Set connection var ansible_connection to ssh 29922 1726853658.29126: Set connection var ansible_module_compression to ZIP_DEFLATED 29922 1726853658.29138: Set connection var ansible_shell_executable to /bin/sh 29922 1726853658.29149: Set connection var ansible_pipelining to False 29922 1726853658.29160: Set connection var ansible_timeout to 10 29922 1726853658.29167: Set connection var ansible_shell_type to sh 29922 1726853658.29196: variable 'ansible_shell_executable' from source: unknown 29922 1726853658.29204: variable 'ansible_connection' from source: unknown 29922 1726853658.29211: variable 'ansible_module_compression' from source: unknown 29922 1726853658.29218: variable 'ansible_shell_type' from source: unknown 29922 1726853658.29224: variable 'ansible_shell_executable' from source: unknown 29922 1726853658.29231: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853658.29239: variable 'ansible_pipelining' from source: unknown 29922 1726853658.29247: variable 'ansible_timeout' from source: unknown 29922 1726853658.29255: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853658.29602: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 29922 1726853658.29618: variable 'omit' from source: magic vars 29922 1726853658.29629: starting attempt loop 29922 1726853658.29636: running the handler 29922 1726853658.29689: handler run complete 29922 1726853658.29706: attempt loop complete, returning result 29922 1726853658.29975: _execute() done 29922 1726853658.29979: dumping result to json 29922 1726853658.29981: done dumping result, returning 29922 1726853658.29985: done running TaskExecutor() for managed_node3/TASK: Show current_interfaces [02083763-bbaf-51d4-513b-000000000276] 29922 1726853658.29988: sending task result for task 02083763-bbaf-51d4-513b-000000000276 29922 1726853658.30060: done sending task result for task 02083763-bbaf-51d4-513b-000000000276 29922 1726853658.30065: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo', 'rpltstbr'] 29922 1726853658.30116: no more pending results, returning what we have 29922 1726853658.30120: results queue empty 29922 1726853658.30121: checking for any_errors_fatal 29922 1726853658.30125: done checking for any_errors_fatal 29922 1726853658.30125: checking for max_fail_percentage 29922 1726853658.30127: done checking for max_fail_percentage 29922 1726853658.30128: checking to see if all hosts have failed and the running result is not ok 29922 1726853658.30129: done checking to see if all hosts have failed 29922 1726853658.30129: getting the remaining hosts for this loop 29922 1726853658.30131: done getting the remaining hosts for this loop 29922 1726853658.30134: getting the next task for host managed_node3 29922 1726853658.30142: done getting next task for host managed_node3 29922 1726853658.30144: ^ task is: TASK: Install iproute 29922 1726853658.30147: ^ state is: HOST STATE: block=2, task=5, 
rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29922 1726853658.30152: getting variables 29922 1726853658.30153: in VariableManager get_vars() 29922 1726853658.30192: Calling all_inventory to load vars for managed_node3 29922 1726853658.30194: Calling groups_inventory to load vars for managed_node3 29922 1726853658.30196: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853658.30207: Calling all_plugins_play to load vars for managed_node3 29922 1726853658.30209: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853658.30212: Calling groups_plugins_play to load vars for managed_node3 29922 1726853658.30638: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853658.31257: done with get_vars() 29922 1726853658.31268: done getting variables 29922 1726853658.31728: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Install iproute] ********************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 Friday 20 September 2024 13:34:18 -0400 (0:00:00.062) 0:00:07.247 ****** 29922 1726853658.31758: entering _queue_task() for managed_node3/package 29922 1726853658.32456: worker is 1 (out of 1 available) 29922 1726853658.32469: exiting _queue_task() for managed_node3/package 29922 1726853658.32884: done queuing things up, now waiting for results queue to drain 29922 1726853658.32886: waiting for pending results... 
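The Install iproute task queued above (task path manage_test_interface.yml:16) resolves through the package action plugin to the ansible.legacy.dnf backend; the module_args recorded further down show name ["iproute"] and state "present", and dnf reports "Nothing to do" because the package is already present. A minimal equivalent task sketched from those arguments (the original task may carry extra options such as retries):

- name: Install iproute
  ansible.builtin.package:
    name: iproute
    state: present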
29922 1726853658.32929: running TaskExecutor() for managed_node3/TASK: Install iproute 29922 1726853658.33278: in run() - task 02083763-bbaf-51d4-513b-0000000001cf 29922 1726853658.33281: variable 'ansible_search_path' from source: unknown 29922 1726853658.33284: variable 'ansible_search_path' from source: unknown 29922 1726853658.33301: calling self._execute() 29922 1726853658.33391: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853658.33676: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853658.33679: variable 'omit' from source: magic vars 29922 1726853658.34176: variable 'ansible_distribution_major_version' from source: facts 29922 1726853658.34476: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853658.34479: variable 'omit' from source: magic vars 29922 1726853658.34481: variable 'omit' from source: magic vars 29922 1726853658.34633: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 29922 1726853658.39457: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 29922 1726853658.39779: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 29922 1726853658.39809: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 29922 1726853658.39848: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 29922 1726853658.39891: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 29922 1726853658.40074: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853658.40207: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29922 1726853658.40238: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853658.40281: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853658.40576: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853658.40603: variable '__network_is_ostree' from source: set_fact 29922 1726853658.40612: variable 'omit' from source: magic vars 29922 1726853658.40653: variable 'omit' from source: magic vars 29922 1726853658.40976: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29922 1726853658.40979: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29922 1726853658.40982: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29922 1726853658.40985: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 29922 1726853658.40987: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853658.41013: variable 'inventory_hostname' from source: host vars for 'managed_node3' 29922 1726853658.41021: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853658.41030: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853658.41135: Set connection var ansible_connection to ssh 29922 1726853658.41289: Set connection var ansible_module_compression to ZIP_DEFLATED 29922 1726853658.41303: Set connection var ansible_shell_executable to /bin/sh 29922 1726853658.41315: Set connection var ansible_pipelining to False 29922 1726853658.41325: Set connection var ansible_timeout to 10 29922 1726853658.41331: Set connection var ansible_shell_type to sh 29922 1726853658.41364: variable 'ansible_shell_executable' from source: unknown 29922 1726853658.41374: variable 'ansible_connection' from source: unknown 29922 1726853658.41386: variable 'ansible_module_compression' from source: unknown 29922 1726853658.41394: variable 'ansible_shell_type' from source: unknown 29922 1726853658.41401: variable 'ansible_shell_executable' from source: unknown 29922 1726853658.41407: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853658.41414: variable 'ansible_pipelining' from source: unknown 29922 1726853658.41482: variable 'ansible_timeout' from source: unknown 29922 1726853658.41496: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853658.41747: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 29922 1726853658.41767: variable 'omit' from source: magic vars 29922 1726853658.41779: starting attempt loop 29922 1726853658.41786: running the handler 29922 1726853658.41797: variable 'ansible_facts' from source: unknown 29922 1726853658.41805: variable 'ansible_facts' from source: unknown 29922 1726853658.41849: _low_level_execute_command(): starting 29922 1726853658.42039: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 29922 1726853658.43469: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853658.43604: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853658.43700: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853658.45887: stdout chunk (state=3): >>>/root <<< 29922 1726853658.45904: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853658.45926: stderr chunk (state=3): >>><<< 29922 1726853658.45969: stdout chunk (state=3): >>><<< 29922 1726853658.46174: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853658.46186: _low_level_execute_command(): starting 29922 1726853658.46189: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853658.4608412-30313-136540089181343 `" && echo ansible-tmp-1726853658.4608412-30313-136540089181343="` echo /root/.ansible/tmp/ansible-tmp-1726853658.4608412-30313-136540089181343 `" ) && sleep 0' 29922 1726853658.47380: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853658.47592: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853658.47606: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853658.47869: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853658.49862: stdout chunk (state=3): 
>>>ansible-tmp-1726853658.4608412-30313-136540089181343=/root/.ansible/tmp/ansible-tmp-1726853658.4608412-30313-136540089181343 <<< 29922 1726853658.49960: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853658.50084: stderr chunk (state=3): >>><<< 29922 1726853658.50101: stdout chunk (state=3): >>><<< 29922 1726853658.50128: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853658.4608412-30313-136540089181343=/root/.ansible/tmp/ansible-tmp-1726853658.4608412-30313-136540089181343 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853658.50170: variable 'ansible_module_compression' from source: unknown 29922 1726853658.50480: ANSIBALLZ: Using generic lock for ansible.legacy.dnf 29922 1726853658.50484: ANSIBALLZ: Acquiring lock 29922 1726853658.50486: ANSIBALLZ: Lock acquired: 140376041361328 29922 1726853658.50488: ANSIBALLZ: Creating module 29922 1726853658.84703: ANSIBALLZ: Writing module into payload 29922 1726853658.85025: ANSIBALLZ: Writing module 29922 1726853658.85167: ANSIBALLZ: Renaming module 29922 1726853658.85188: ANSIBALLZ: Done creating module 29922 1726853658.85213: variable 'ansible_facts' from source: unknown 29922 1726853658.85428: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853658.4608412-30313-136540089181343/AnsiballZ_dnf.py 29922 1726853658.85680: Sending initial data 29922 1726853658.85904: Sent initial data (152 bytes) 29922 1726853658.87055: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853658.87076: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853658.87095: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853658.87277: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853658.88885: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 29922 1726853658.88932: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 29922 1726853658.89006: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-29922avbm1zsd/tmpg_hehbzm /root/.ansible/tmp/ansible-tmp-1726853658.4608412-30313-136540089181343/AnsiballZ_dnf.py <<< 29922 1726853658.89009: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853658.4608412-30313-136540089181343/AnsiballZ_dnf.py" <<< 29922 1726853658.89299: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-29922avbm1zsd/tmpg_hehbzm" to remote "/root/.ansible/tmp/ansible-tmp-1726853658.4608412-30313-136540089181343/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853658.4608412-30313-136540089181343/AnsiballZ_dnf.py" <<< 29922 1726853658.90695: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853658.90699: stdout chunk (state=3): >>><<< 29922 1726853658.90705: stderr chunk (state=3): >>><<< 29922 1726853658.90745: done transferring module to remote 29922 1726853658.90756: _low_level_execute_command(): starting 29922 1726853658.90761: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853658.4608412-30313-136540089181343/ /root/.ansible/tmp/ansible-tmp-1726853658.4608412-30313-136540089181343/AnsiballZ_dnf.py && sleep 0' 29922 1726853658.92075: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853658.92120: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853658.92123: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853658.92126: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853658.92143: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 29922 1726853658.92215: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853658.92219: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853658.92458: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853658.92462: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853658.92465: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853658.92529: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853658.94398: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853658.94445: stderr chunk (state=3): >>><<< 29922 1726853658.94542: stdout chunk (state=3): >>><<< 29922 1726853658.94565: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853658.94583: _low_level_execute_command(): starting 29922 1726853658.94756: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853658.4608412-30313-136540089181343/AnsiballZ_dnf.py && sleep 0' 29922 1726853658.95843: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853658.96088: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853658.96393: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853658.96435: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853658.96529: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853659.38768: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 29922 1726853659.43291: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 29922 1726853659.43340: stderr chunk (state=3): >>><<< 29922 1726853659.43343: stdout chunk (state=3): >>><<< 29922 1726853659.43379: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 
Shared connection to 10.31.11.217 closed. 29922 1726853659.43463: done with _execute_module (ansible.legacy.dnf, {'name': 'iproute', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853658.4608412-30313-136540089181343/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 29922 1726853659.43474: _low_level_execute_command(): starting 29922 1726853659.43477: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853658.4608412-30313-136540089181343/ > /dev/null 2>&1 && sleep 0' 29922 1726853659.44527: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853659.44550: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853659.44579: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853659.44682: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853659.44713: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853659.44727: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853659.44749: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853659.44911: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853659.46812: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853659.46882: stderr chunk (state=3): >>><<< 29922 1726853659.46885: stdout chunk (state=3): >>><<< 29922 1726853659.46899: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853659.46920: handler run complete 29922 1726853659.47108: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 29922 1726853659.47361: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 29922 1726853659.47407: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 29922 1726853659.47541: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 29922 1726853659.47623: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 29922 1726853659.47848: variable '__install_status' from source: unknown 29922 1726853659.47906: Evaluated conditional (__install_status is success): True 29922 1726853659.47936: attempt loop complete, returning result 29922 1726853659.48048: _execute() done 29922 1726853659.48050: dumping result to json 29922 1726853659.48053: done dumping result, returning 29922 1726853659.48055: done running TaskExecutor() for managed_node3/TASK: Install iproute [02083763-bbaf-51d4-513b-0000000001cf] 29922 1726853659.48057: sending task result for task 02083763-bbaf-51d4-513b-0000000001cf 29922 1726853659.48476: done sending task result for task 02083763-bbaf-51d4-513b-0000000001cf 29922 1726853659.48480: WORKER PROCESS EXITING ok: [managed_node3] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 29922 1726853659.48576: no more pending results, returning what we have 29922 1726853659.48580: results queue empty 29922 1726853659.48581: checking for any_errors_fatal 29922 1726853659.48585: done checking for any_errors_fatal 29922 1726853659.48586: checking for max_fail_percentage 29922 1726853659.48587: done checking for max_fail_percentage 29922 1726853659.48588: checking to see if all hosts have failed and the running result is not ok 29922 1726853659.48590: done checking to see if all hosts have failed 29922 1726853659.48591: getting the remaining hosts for this loop 29922 1726853659.48593: done getting the remaining hosts for this loop 29922 1726853659.48596: getting the next task for host managed_node3 29922 1726853659.48603: done getting next task for host managed_node3 29922 1726853659.48606: ^ task is: TASK: Create veth interface {{ interface }} 29922 1726853659.48608: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853659.48612: getting variables 29922 1726853659.48614: in VariableManager get_vars() 29922 1726853659.48653: Calling all_inventory to load vars for managed_node3 29922 1726853659.48656: Calling groups_inventory to load vars for managed_node3 29922 1726853659.48661: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853659.48879: Calling all_plugins_play to load vars for managed_node3 29922 1726853659.48883: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853659.48887: Calling groups_plugins_play to load vars for managed_node3 29922 1726853659.49280: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853659.49477: done with get_vars() 29922 1726853659.49485: done getting variables 29922 1726853659.49525: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 29922 1726853659.49619: variable 'interface' from source: set_fact TASK [Create veth interface ethtest0] ****************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Friday 20 September 2024 13:34:19 -0400 (0:00:01.178) 0:00:08.426 ****** 29922 1726853659.49647: entering _queue_task() for managed_node3/command 29922 1726853659.49911: worker is 1 (out of 1 available) 29922 1726853659.49925: exiting _queue_task() for managed_node3/command 29922 1726853659.49938: done queuing things up, now waiting for results queue to drain 29922 1726853659.49939: waiting for pending results... 
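The task header above points at /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27, but the file itself is not reproduced in this log. Judging from the templated task name, the items lookup, and the conditional evaluated in the trace that follows, the task is presumably a command loop along these lines; this is a hedged reconstruction for orientation, not the actual file contents, and the real loop may carry further items that are executed later in the run.

# Hypothetical reconstruction inferred from the logged invocation; not the real task file.
- name: Create veth interface {{ interface }}
  ansible.builtin.command: "{{ item }}"          # each item is a complete 'ip' command string
  with_items:
    - ip link add {{ interface }} type veth peer name peer{{ interface }}
    - ip link set peer{{ interface }} up
  when: type == 'veth' and state == 'present' and interface not in current_interfaces

With interface set to ethtest0 via set_fact, the first item renders to the exact command seen below: ip link add ethtest0 type veth peer name peerethtest0.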
29922 1726853659.50130: running TaskExecutor() for managed_node3/TASK: Create veth interface ethtest0 29922 1726853659.50299: in run() - task 02083763-bbaf-51d4-513b-0000000001d0 29922 1726853659.50303: variable 'ansible_search_path' from source: unknown 29922 1726853659.50306: variable 'ansible_search_path' from source: unknown 29922 1726853659.50524: variable 'interface' from source: set_fact 29922 1726853659.50777: variable 'interface' from source: set_fact 29922 1726853659.50780: variable 'interface' from source: set_fact 29922 1726853659.50844: Loaded config def from plugin (lookup/items) 29922 1726853659.50852: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 29922 1726853659.50884: variable 'omit' from source: magic vars 29922 1726853659.51006: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853659.51016: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853659.51028: variable 'omit' from source: magic vars 29922 1726853659.51591: variable 'ansible_distribution_major_version' from source: facts 29922 1726853659.51606: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853659.51800: variable 'type' from source: set_fact 29922 1726853659.51809: variable 'state' from source: include params 29922 1726853659.51818: variable 'interface' from source: set_fact 29922 1726853659.51826: variable 'current_interfaces' from source: set_fact 29922 1726853659.51837: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 29922 1726853659.51847: variable 'omit' from source: magic vars 29922 1726853659.51891: variable 'omit' from source: magic vars 29922 1726853659.51937: variable 'item' from source: unknown 29922 1726853659.52010: variable 'item' from source: unknown 29922 1726853659.52028: variable 'omit' from source: magic vars 29922 1726853659.52063: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29922 1726853659.52098: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29922 1726853659.52123: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29922 1726853659.52146: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853659.52166: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853659.52201: variable 'inventory_hostname' from source: host vars for 'managed_node3' 29922 1726853659.52209: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853659.52218: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853659.52310: Set connection var ansible_connection to ssh 29922 1726853659.52322: Set connection var ansible_module_compression to ZIP_DEFLATED 29922 1726853659.52331: Set connection var ansible_shell_executable to /bin/sh 29922 1726853659.52337: Set connection var ansible_pipelining to False 29922 1726853659.52340: Set connection var ansible_timeout to 10 29922 1726853659.52342: Set connection var ansible_shell_type to sh 29922 1726853659.52370: variable 'ansible_shell_executable' from source: unknown 29922 1726853659.52376: variable 'ansible_connection' from source: unknown 29922 1726853659.52378: variable 
'ansible_module_compression' from source: unknown 29922 1726853659.52381: variable 'ansible_shell_type' from source: unknown 29922 1726853659.52383: variable 'ansible_shell_executable' from source: unknown 29922 1726853659.52385: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853659.52387: variable 'ansible_pipelining' from source: unknown 29922 1726853659.52389: variable 'ansible_timeout' from source: unknown 29922 1726853659.52391: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853659.52490: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 29922 1726853659.52498: variable 'omit' from source: magic vars 29922 1726853659.52502: starting attempt loop 29922 1726853659.52505: running the handler 29922 1726853659.52519: _low_level_execute_command(): starting 29922 1726853659.52525: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 29922 1726853659.53025: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853659.53029: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853659.53033: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 29922 1726853659.53037: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853659.53077: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853659.53080: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853659.53150: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853659.54855: stdout chunk (state=3): >>>/root <<< 29922 1726853659.54982: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853659.54985: stdout chunk (state=3): >>><<< 29922 1726853659.54987: stderr chunk (state=3): >>><<< 29922 1726853659.55016: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853659.55073: _low_level_execute_command(): starting 29922 1726853659.55088: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853659.5501456-30345-268150161331435 `" && echo ansible-tmp-1726853659.5501456-30345-268150161331435="` echo /root/.ansible/tmp/ansible-tmp-1726853659.5501456-30345-268150161331435 `" ) && sleep 0' 29922 1726853659.55576: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853659.55580: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853659.55582: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853659.55611: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853659.55615: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 29922 1726853659.55617: stderr chunk (state=3): >>>debug2: match not found <<< 29922 1726853659.55620: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853659.55678: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 29922 1726853659.55681: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<< 29922 1726853659.55688: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853659.55723: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853659.55734: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853659.55751: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853659.55832: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853659.57796: stdout chunk (state=3): >>>ansible-tmp-1726853659.5501456-30345-268150161331435=/root/.ansible/tmp/ansible-tmp-1726853659.5501456-30345-268150161331435 <<< 29922 1726853659.57905: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853659.57990: stderr chunk (state=3): >>><<< 29922 1726853659.57993: stdout chunk (state=3): >>><<< 29922 1726853659.58192: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726853659.5501456-30345-268150161331435=/root/.ansible/tmp/ansible-tmp-1726853659.5501456-30345-268150161331435 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853659.58196: variable 'ansible_module_compression' from source: unknown 29922 1726853659.58198: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-29922avbm1zsd/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 29922 1726853659.58200: variable 'ansible_facts' from source: unknown 29922 1726853659.58453: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853659.5501456-30345-268150161331435/AnsiballZ_command.py 29922 1726853659.58811: Sending initial data 29922 1726853659.58816: Sent initial data (156 bytes) 29922 1726853659.59975: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853659.59992: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853659.60084: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853659.61734: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 29922 1726853659.61762: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension 
"hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 29922 1726853659.61851: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 29922 1726853659.61923: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-29922avbm1zsd/tmpgmae0881 /root/.ansible/tmp/ansible-tmp-1726853659.5501456-30345-268150161331435/AnsiballZ_command.py <<< 29922 1726853659.61926: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853659.5501456-30345-268150161331435/AnsiballZ_command.py" <<< 29922 1726853659.62008: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-29922avbm1zsd/tmpgmae0881" to remote "/root/.ansible/tmp/ansible-tmp-1726853659.5501456-30345-268150161331435/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853659.5501456-30345-268150161331435/AnsiballZ_command.py" <<< 29922 1726853659.62942: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853659.62945: stdout chunk (state=3): >>><<< 29922 1726853659.62948: stderr chunk (state=3): >>><<< 29922 1726853659.62950: done transferring module to remote 29922 1726853659.62968: _low_level_execute_command(): starting 29922 1726853659.63005: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853659.5501456-30345-268150161331435/ /root/.ansible/tmp/ansible-tmp-1726853659.5501456-30345-268150161331435/AnsiballZ_command.py && sleep 0' 29922 1726853659.63675: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853659.63708: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853659.63790: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853659.63804: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853659.63868: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853659.63904: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853659.64001: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853659.65972: stderr 
chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853659.65977: stdout chunk (state=3): >>><<< 29922 1726853659.65979: stderr chunk (state=3): >>><<< 29922 1726853659.66131: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853659.66135: _low_level_execute_command(): starting 29922 1726853659.66138: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853659.5501456-30345-268150161331435/AnsiballZ_command.py && sleep 0' 29922 1726853659.67165: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853659.67170: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853659.67268: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853659.67418: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853659.83594: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "ethtest0", "type", "veth", "peer", "name", "peerethtest0"], "start": "2024-09-20 13:34:19.827634", "end": "2024-09-20 13:34:19.833052", "delta": "0:00:00.005418", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add ethtest0 type veth peer name peerethtest0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": 
true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 29922 1726853659.86425: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 29922 1726853659.86429: stdout chunk (state=3): >>><<< 29922 1726853659.86431: stderr chunk (state=3): >>><<< 29922 1726853659.86433: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "ethtest0", "type", "veth", "peer", "name", "peerethtest0"], "start": "2024-09-20 13:34:19.827634", "end": "2024-09-20 13:34:19.833052", "delta": "0:00:00.005418", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add ethtest0 type veth peer name peerethtest0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
29922 1726853659.86482: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link add ethtest0 type veth peer name peerethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853659.5501456-30345-268150161331435/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 29922 1726853659.86499: _low_level_execute_command(): starting 29922 1726853659.86510: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853659.5501456-30345-268150161331435/ > /dev/null 2>&1 && sleep 0' 29922 1726853659.87202: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853659.87278: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853659.87281: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853659.87284: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853659.87367: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853659.91758: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853659.91780: stderr chunk (state=3): >>><<< 29922 1726853659.91784: stdout chunk (state=3): >>><<< 29922 1726853659.91802: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853659.91808: handler run complete 29922 1726853659.91837: Evaluated conditional (False): False 29922 1726853659.91849: attempt loop complete, returning result 29922 1726853659.91862: variable 'item' from source: unknown 29922 1726853659.91953: variable 'item' from source: unknown ok: [managed_node3] => (item=ip link add ethtest0 type veth peer name peerethtest0) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "add", "ethtest0", "type", "veth", "peer", "name", "peerethtest0" ], "delta": "0:00:00.005418", "end": "2024-09-20 13:34:19.833052", "item": "ip link add ethtest0 type veth peer name peerethtest0", "rc": 0, "start": "2024-09-20 13:34:19.827634" } 29922 1726853659.92133: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853659.92136: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853659.92139: variable 'omit' from source: magic vars 29922 1726853659.92209: variable 'ansible_distribution_major_version' from source: facts 29922 1726853659.92213: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853659.92370: variable 'type' from source: set_fact 29922 1726853659.92375: variable 'state' from source: include params 29922 1726853659.92379: variable 'interface' from source: set_fact 29922 1726853659.92382: variable 'current_interfaces' from source: set_fact 29922 1726853659.92392: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 29922 1726853659.92394: variable 'omit' from source: magic vars 29922 1726853659.92404: variable 'omit' from source: magic vars 29922 1726853659.92430: variable 'item' from source: unknown 29922 1726853659.92489: variable 'item' from source: unknown 29922 1726853659.92506: variable 'omit' from source: magic vars 29922 1726853659.92517: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29922 1726853659.92524: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853659.92530: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853659.92541: variable 'inventory_hostname' from source: host vars for 'managed_node3' 29922 1726853659.92544: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853659.92546: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853659.92596: Set connection var ansible_connection to ssh 29922 1726853659.92605: Set connection var ansible_module_compression to ZIP_DEFLATED 29922 1726853659.92609: Set connection var ansible_shell_executable to /bin/sh 29922 1726853659.92617: Set connection var ansible_pipelining to False 29922 1726853659.92620: Set connection var ansible_timeout to 10 29922 1726853659.92622: Set connection var ansible_shell_type to sh 29922 1726853659.92639: variable 'ansible_shell_executable' from source: unknown 
29922 1726853659.92642: variable 'ansible_connection' from source: unknown 29922 1726853659.92645: variable 'ansible_module_compression' from source: unknown 29922 1726853659.92647: variable 'ansible_shell_type' from source: unknown 29922 1726853659.92649: variable 'ansible_shell_executable' from source: unknown 29922 1726853659.92651: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853659.92655: variable 'ansible_pipelining' from source: unknown 29922 1726853659.92660: variable 'ansible_timeout' from source: unknown 29922 1726853659.92662: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853659.92730: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 29922 1726853659.92738: variable 'omit' from source: magic vars 29922 1726853659.92742: starting attempt loop 29922 1726853659.92745: running the handler 29922 1726853659.92751: _low_level_execute_command(): starting 29922 1726853659.92754: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 29922 1726853659.93233: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853659.93236: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853659.93245: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853659.93247: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853659.93318: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853659.93384: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853659.95023: stdout chunk (state=3): >>>/root <<< 29922 1726853659.95123: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853659.95153: stderr chunk (state=3): >>><<< 29922 1726853659.95156: stdout chunk (state=3): >>><<< 29922 1726853659.95173: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853659.95181: _low_level_execute_command(): starting 29922 1726853659.95186: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853659.9517272-30345-69974560300630 `" && echo ansible-tmp-1726853659.9517272-30345-69974560300630="` echo /root/.ansible/tmp/ansible-tmp-1726853659.9517272-30345-69974560300630 `" ) && sleep 0' 29922 1726853659.95641: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853659.95644: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 29922 1726853659.95646: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853659.95649: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853659.95651: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853659.95704: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853659.95708: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853659.95711: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853659.95770: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853659.97703: stdout chunk (state=3): >>>ansible-tmp-1726853659.9517272-30345-69974560300630=/root/.ansible/tmp/ansible-tmp-1726853659.9517272-30345-69974560300630 <<< 29922 1726853659.97806: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853659.97833: stderr chunk (state=3): >>><<< 29922 1726853659.97837: stdout chunk (state=3): >>><<< 29922 1726853659.97850: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853659.9517272-30345-69974560300630=/root/.ansible/tmp/ansible-tmp-1726853659.9517272-30345-69974560300630 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853659.97873: variable 'ansible_module_compression' from source: unknown 29922 1726853659.97906: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-29922avbm1zsd/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 29922 1726853659.97925: variable 'ansible_facts' from source: unknown 29922 1726853659.97968: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853659.9517272-30345-69974560300630/AnsiballZ_command.py 29922 1726853659.98066: Sending initial data 29922 1726853659.98070: Sent initial data (155 bytes) 29922 1726853659.98527: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853659.98531: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 29922 1726853659.98533: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 29922 1726853659.98535: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853659.98537: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853659.98594: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853659.98606: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853659.98662: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853660.00267: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: 
Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 29922 1726853660.00322: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 29922 1726853660.00386: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-29922avbm1zsd/tmp2wv0dg7j /root/.ansible/tmp/ansible-tmp-1726853659.9517272-30345-69974560300630/AnsiballZ_command.py <<< 29922 1726853660.00388: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853659.9517272-30345-69974560300630/AnsiballZ_command.py" <<< 29922 1726853660.00437: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-29922avbm1zsd/tmp2wv0dg7j" to remote "/root/.ansible/tmp/ansible-tmp-1726853659.9517272-30345-69974560300630/AnsiballZ_command.py" <<< 29922 1726853660.00443: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853659.9517272-30345-69974560300630/AnsiballZ_command.py" <<< 29922 1726853660.01060: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853660.01102: stderr chunk (state=3): >>><<< 29922 1726853660.01105: stdout chunk (state=3): >>><<< 29922 1726853660.01132: done transferring module to remote 29922 1726853660.01140: _low_level_execute_command(): starting 29922 1726853660.01144: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853659.9517272-30345-69974560300630/ /root/.ansible/tmp/ansible-tmp-1726853659.9517272-30345-69974560300630/AnsiballZ_command.py && sleep 0' 29922 1726853660.01612: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853660.01616: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853660.01618: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration <<< 29922 1726853660.01620: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 29922 1726853660.01622: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853660.01675: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853660.01679: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853660.01683: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853660.01744: stderr chunk (state=3): >>>debug1: mux_client_request_session: 
master session id: 2 <<< 29922 1726853660.03588: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853660.03613: stderr chunk (state=3): >>><<< 29922 1726853660.03616: stdout chunk (state=3): >>><<< 29922 1726853660.03631: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853660.03634: _low_level_execute_command(): starting 29922 1726853660.03639: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853659.9517272-30345-69974560300630/AnsiballZ_command.py && sleep 0' 29922 1726853660.04089: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853660.04093: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853660.04095: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853660.04097: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853660.04153: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853660.04156: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853660.04163: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853660.04228: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853660.20087: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerethtest0", "up"], "start": "2024-09-20 13:34:20.195524", "end": "2024-09-20 13:34:20.199615", 
"delta": "0:00:00.004091", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerethtest0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 29922 1726853660.21888: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 29922 1726853660.21893: stdout chunk (state=3): >>><<< 29922 1726853660.21895: stderr chunk (state=3): >>><<< 29922 1726853660.21898: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerethtest0", "up"], "start": "2024-09-20 13:34:20.195524", "end": "2024-09-20 13:34:20.199615", "delta": "0:00:00.004091", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerethtest0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
29922 1726853660.21902: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set peerethtest0 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853659.9517272-30345-69974560300630/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 29922 1726853660.21904: _low_level_execute_command(): starting 29922 1726853660.21906: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853659.9517272-30345-69974560300630/ > /dev/null 2>&1 && sleep 0' 29922 1726853660.22519: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853660.22549: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853660.22565: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853660.22586: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853660.22660: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853660.22714: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853660.22738: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853660.22839: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853660.24754: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853660.24778: stdout chunk (state=3): >>><<< 29922 1726853660.24790: stderr chunk (state=3): >>><<< 29922 1726853660.24811: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853660.24822: handler run complete 29922 1726853660.24847: Evaluated conditional (False): False 29922 1726853660.24862: attempt loop complete, returning result 29922 1726853660.24899: variable 'item' from source: unknown 29922 1726853660.24992: variable 'item' from source: unknown ok: [managed_node3] => (item=ip link set peerethtest0 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "peerethtest0", "up" ], "delta": "0:00:00.004091", "end": "2024-09-20 13:34:20.199615", "item": "ip link set peerethtest0 up", "rc": 0, "start": "2024-09-20 13:34:20.195524" } 29922 1726853660.25269: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853660.25274: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853660.25276: variable 'omit' from source: magic vars 29922 1726853660.25482: variable 'ansible_distribution_major_version' from source: facts 29922 1726853660.25485: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853660.25614: variable 'type' from source: set_fact 29922 1726853660.25617: variable 'state' from source: include params 29922 1726853660.25620: variable 'interface' from source: set_fact 29922 1726853660.25622: variable 'current_interfaces' from source: set_fact 29922 1726853660.25676: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 29922 1726853660.25679: variable 'omit' from source: magic vars 29922 1726853660.25681: variable 'omit' from source: magic vars 29922 1726853660.25712: variable 'item' from source: unknown 29922 1726853660.25779: variable 'item' from source: unknown 29922 1726853660.25805: variable 'omit' from source: magic vars 29922 1726853660.25839: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29922 1726853660.25857: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853660.25917: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853660.25923: variable 'inventory_hostname' from source: host vars for 'managed_node3' 29922 1726853660.25925: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853660.25928: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853660.25994: Set connection var ansible_connection to ssh 29922 1726853660.26005: Set connection var ansible_module_compression to ZIP_DEFLATED 29922 1726853660.26024: Set connection var ansible_shell_executable to /bin/sh 29922 1726853660.26045: Set connection var ansible_pipelining to False 29922 1726853660.26056: Set connection var ansible_timeout to 10 29922 1726853660.26064: Set connection var ansible_shell_type to sh 29922 1726853660.26133: variable 
'ansible_shell_executable' from source: unknown 29922 1726853660.26141: variable 'ansible_connection' from source: unknown 29922 1726853660.26143: variable 'ansible_module_compression' from source: unknown 29922 1726853660.26145: variable 'ansible_shell_type' from source: unknown 29922 1726853660.26147: variable 'ansible_shell_executable' from source: unknown 29922 1726853660.26152: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853660.26155: variable 'ansible_pipelining' from source: unknown 29922 1726853660.26156: variable 'ansible_timeout' from source: unknown 29922 1726853660.26158: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853660.26242: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 29922 1726853660.26263: variable 'omit' from source: magic vars 29922 1726853660.26354: starting attempt loop 29922 1726853660.26358: running the handler 29922 1726853660.26361: _low_level_execute_command(): starting 29922 1726853660.26363: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 29922 1726853660.27006: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853660.27025: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853660.27116: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853660.27145: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853660.27250: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853660.28923: stdout chunk (state=3): >>>/root <<< 29922 1726853660.29075: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853660.29079: stdout chunk (state=3): >>><<< 29922 1726853660.29081: stderr chunk (state=3): >>><<< 29922 1726853660.29177: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853660.29180: _low_level_execute_command(): starting 29922 1726853660.29183: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853660.2910607-30345-131330948908463 `" && echo ansible-tmp-1726853660.2910607-30345-131330948908463="` echo /root/.ansible/tmp/ansible-tmp-1726853660.2910607-30345-131330948908463 `" ) && sleep 0' 29922 1726853660.29763: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853660.29780: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853660.29793: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853660.29842: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853660.29856: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 29922 1726853660.29869: stderr chunk (state=3): >>>debug2: match found <<< 29922 1726853660.29953: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853660.29968: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853660.29987: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853660.30014: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853660.30097: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853660.32065: stdout chunk (state=3): >>>ansible-tmp-1726853660.2910607-30345-131330948908463=/root/.ansible/tmp/ansible-tmp-1726853660.2910607-30345-131330948908463 <<< 29922 1726853660.32175: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853660.32199: stderr chunk (state=3): >>><<< 29922 1726853660.32204: stdout chunk (state=3): >>><<< 29922 1726853660.32223: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853660.2910607-30345-131330948908463=/root/.ansible/tmp/ansible-tmp-1726853660.2910607-30345-131330948908463 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853660.32242: variable 'ansible_module_compression' from source: unknown 29922 1726853660.32282: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-29922avbm1zsd/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 29922 1726853660.32301: variable 'ansible_facts' from source: unknown 29922 1726853660.32359: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853660.2910607-30345-131330948908463/AnsiballZ_command.py 29922 1726853660.32461: Sending initial data 29922 1726853660.32465: Sent initial data (156 bytes) 29922 1726853660.33094: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853660.33146: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853660.33150: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853660.33214: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853660.34828: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension 
"limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 29922 1726853660.34834: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 29922 1726853660.34883: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 29922 1726853660.34946: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-29922avbm1zsd/tmpfzvvdlcx /root/.ansible/tmp/ansible-tmp-1726853660.2910607-30345-131330948908463/AnsiballZ_command.py <<< 29922 1726853660.34950: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853660.2910607-30345-131330948908463/AnsiballZ_command.py" <<< 29922 1726853660.34999: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-29922avbm1zsd/tmpfzvvdlcx" to remote "/root/.ansible/tmp/ansible-tmp-1726853660.2910607-30345-131330948908463/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853660.2910607-30345-131330948908463/AnsiballZ_command.py" <<< 29922 1726853660.35591: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853660.35627: stderr chunk (state=3): >>><<< 29922 1726853660.35630: stdout chunk (state=3): >>><<< 29922 1726853660.35657: done transferring module to remote 29922 1726853660.35666: _low_level_execute_command(): starting 29922 1726853660.35672: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853660.2910607-30345-131330948908463/ /root/.ansible/tmp/ansible-tmp-1726853660.2910607-30345-131330948908463/AnsiballZ_command.py && sleep 0' 29922 1726853660.36279: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853660.36294: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853660.36309: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853660.36395: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853660.38238: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853660.38266: stderr chunk (state=3): >>><<< 29922 1726853660.38269: stdout chunk (state=3): >>><<< 29922 1726853660.38285: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853660.38288: _low_level_execute_command(): starting 29922 1726853660.38291: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853660.2910607-30345-131330948908463/AnsiballZ_command.py && sleep 0' 29922 1726853660.38708: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853660.38711: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 29922 1726853660.38714: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853660.38716: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853660.38722: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853660.38768: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853660.38772: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853660.38841: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853660.54738: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "ethtest0", "up"], "start": "2024-09-20 13:34:20.541766", "end": "2024-09-20 13:34:20.545716", "delta": "0:00:00.003950", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set ethtest0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 29922 1726853660.56469: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 29922 1726853660.56475: stdout chunk (state=3): >>><<< 29922 1726853660.56478: stderr chunk (state=3): >>><<< 29922 1726853660.56498: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "ethtest0", "up"], "start": "2024-09-20 13:34:20.541766", "end": "2024-09-20 13:34:20.545716", "delta": "0:00:00.003950", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set ethtest0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
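The invocation recorded above has "_uses_shell": false, and the result's "cmd" field is the word-split argv rather than a single shell string. A hedged illustration (not Ansible's own code) of that split-and-exec behaviour with shlex and subprocess; it needs root and an existing ethtest0 link to actually succeed:

    import shlex
    import subprocess

    raw_params = "ip link set ethtest0 up"   # _raw_params from the invocation above
    argv = shlex.split(raw_params)           # ['ip', 'link', 'set', 'ethtest0', 'up']

    # No shell is involved, mirroring "_uses_shell": false in the logged invocation.
    proc = subprocess.run(argv, capture_output=True, text=True)
    print(proc.returncode, proc.stdout, proc.stderr)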
29922 1726853660.56531: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set ethtest0 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853660.2910607-30345-131330948908463/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 29922 1726853660.56541: _low_level_execute_command(): starting 29922 1726853660.56576: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853660.2910607-30345-131330948908463/ > /dev/null 2>&1 && sleep 0' 29922 1726853660.57202: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853660.57218: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853660.57241: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853660.57353: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853660.57381: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853660.57396: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853660.57483: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853660.59393: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853660.59443: stderr chunk (state=3): >>><<< 29922 1726853660.59461: stdout chunk (state=3): >>><<< 29922 1726853660.59490: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853660.59501: handler run complete 29922 1726853660.59528: Evaluated conditional (False): False 29922 1726853660.59559: attempt loop complete, returning result 29922 1726853660.59666: variable 'item' from source: unknown 29922 1726853660.59672: variable 'item' from source: unknown ok: [managed_node3] => (item=ip link set ethtest0 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "ethtest0", "up" ], "delta": "0:00:00.003950", "end": "2024-09-20 13:34:20.545716", "item": "ip link set ethtest0 up", "rc": 0, "start": "2024-09-20 13:34:20.541766" } 29922 1726853660.59903: dumping result to json 29922 1726853660.59906: done dumping result, returning 29922 1726853660.59908: done running TaskExecutor() for managed_node3/TASK: Create veth interface ethtest0 [02083763-bbaf-51d4-513b-0000000001d0] 29922 1726853660.59915: sending task result for task 02083763-bbaf-51d4-513b-0000000001d0 29922 1726853660.60899: done sending task result for task 02083763-bbaf-51d4-513b-0000000001d0 29922 1726853660.60911: WORKER PROCESS EXITING 29922 1726853660.61025: no more pending results, returning what we have 29922 1726853660.61028: results queue empty 29922 1726853660.61029: checking for any_errors_fatal 29922 1726853660.61032: done checking for any_errors_fatal 29922 1726853660.61032: checking for max_fail_percentage 29922 1726853660.61034: done checking for max_fail_percentage 29922 1726853660.61035: checking to see if all hosts have failed and the running result is not ok 29922 1726853660.61036: done checking to see if all hosts have failed 29922 1726853660.61036: getting the remaining hosts for this loop 29922 1726853660.61037: done getting the remaining hosts for this loop 29922 1726853660.61041: getting the next task for host managed_node3 29922 1726853660.61045: done getting next task for host managed_node3 29922 1726853660.61047: ^ task is: TASK: Set up veth as managed by NetworkManager 29922 1726853660.61049: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853660.61052: getting variables 29922 1726853660.61053: in VariableManager get_vars() 29922 1726853660.61093: Calling all_inventory to load vars for managed_node3 29922 1726853660.61096: Calling groups_inventory to load vars for managed_node3 29922 1726853660.61098: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853660.61108: Calling all_plugins_play to load vars for managed_node3 29922 1726853660.61111: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853660.61114: Calling groups_plugins_play to load vars for managed_node3 29922 1726853660.61413: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853660.61676: done with get_vars() 29922 1726853660.61685: done getting variables 29922 1726853660.61751: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set up veth as managed by NetworkManager] ******************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35 Friday 20 September 2024 13:34:20 -0400 (0:00:01.121) 0:00:09.547 ****** 29922 1726853660.61779: entering _queue_task() for managed_node3/command 29922 1726853660.62122: worker is 1 (out of 1 available) 29922 1726853660.62136: exiting _queue_task() for managed_node3/command 29922 1726853660.62149: done queuing things up, now waiting for results queue to drain 29922 1726853660.62151: waiting for pending results... 
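Before the module for this task is copied over, a fresh remote temporary directory is created with the `umask 77 && mkdir -p ... && mkdir ...` command visible just below (and earlier in this run). A rough Python sketch of that naming and permission pattern; the guess that the middle number is process-related and the last one random is an assumption of this sketch, not something the log states:

    import os
    import random
    import time

    # Produces names like ansible-tmp-1726853660.6608229-30395-8977773275527
    basename = "ansible-tmp-%s-%s-%s" % (time.time(), os.getpid(),
                                         random.randint(0, 2**48))
    path = os.path.join(os.path.expanduser("~/.ansible/tmp"), basename)

    old_umask = os.umask(0o077)   # mirrors `umask 77` in the logged shell command
    try:
        os.makedirs(path)         # creates the parent and the new per-task directory
    finally:
        os.umask(old_umask)
    print(path)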
29922 1726853660.62353: running TaskExecutor() for managed_node3/TASK: Set up veth as managed by NetworkManager 29922 1726853660.62426: in run() - task 02083763-bbaf-51d4-513b-0000000001d1 29922 1726853660.62437: variable 'ansible_search_path' from source: unknown 29922 1726853660.62441: variable 'ansible_search_path' from source: unknown 29922 1726853660.62481: calling self._execute() 29922 1726853660.62553: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853660.62557: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853660.62570: variable 'omit' from source: magic vars 29922 1726853660.62850: variable 'ansible_distribution_major_version' from source: facts 29922 1726853660.62860: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853660.62969: variable 'type' from source: set_fact 29922 1726853660.62974: variable 'state' from source: include params 29922 1726853660.62978: Evaluated conditional (type == 'veth' and state == 'present'): True 29922 1726853660.62985: variable 'omit' from source: magic vars 29922 1726853660.63011: variable 'omit' from source: magic vars 29922 1726853660.63082: variable 'interface' from source: set_fact 29922 1726853660.63096: variable 'omit' from source: magic vars 29922 1726853660.63128: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29922 1726853660.63159: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29922 1726853660.63179: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29922 1726853660.63192: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853660.63201: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853660.63224: variable 'inventory_hostname' from source: host vars for 'managed_node3' 29922 1726853660.63227: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853660.63229: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853660.63301: Set connection var ansible_connection to ssh 29922 1726853660.63307: Set connection var ansible_module_compression to ZIP_DEFLATED 29922 1726853660.63314: Set connection var ansible_shell_executable to /bin/sh 29922 1726853660.63321: Set connection var ansible_pipelining to False 29922 1726853660.63326: Set connection var ansible_timeout to 10 29922 1726853660.63329: Set connection var ansible_shell_type to sh 29922 1726853660.63347: variable 'ansible_shell_executable' from source: unknown 29922 1726853660.63351: variable 'ansible_connection' from source: unknown 29922 1726853660.63354: variable 'ansible_module_compression' from source: unknown 29922 1726853660.63358: variable 'ansible_shell_type' from source: unknown 29922 1726853660.63361: variable 'ansible_shell_executable' from source: unknown 29922 1726853660.63364: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853660.63366: variable 'ansible_pipelining' from source: unknown 29922 1726853660.63368: variable 'ansible_timeout' from source: unknown 29922 1726853660.63372: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853660.63475: Loading ActionModule 'command' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 29922 1726853660.63486: variable 'omit' from source: magic vars 29922 1726853660.63491: starting attempt loop 29922 1726853660.63493: running the handler 29922 1726853660.63505: _low_level_execute_command(): starting 29922 1726853660.63512: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 29922 1726853660.64033: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853660.64039: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853660.64042: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853660.64045: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853660.64097: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853660.64102: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853660.64106: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853660.64169: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853660.65911: stdout chunk (state=3): >>>/root <<< 29922 1726853660.66048: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853660.66053: stdout chunk (state=3): >>><<< 29922 1726853660.66055: stderr chunk (state=3): >>><<< 29922 1726853660.66180: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853660.66185: _low_level_execute_command(): starting 29922 1726853660.66189: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853660.6608229-30395-8977773275527 `" && echo ansible-tmp-1726853660.6608229-30395-8977773275527="` echo /root/.ansible/tmp/ansible-tmp-1726853660.6608229-30395-8977773275527 `" ) && sleep 0' 29922 1726853660.66728: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853660.66744: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853660.66768: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853660.66825: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853660.66898: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853660.66950: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853660.67043: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853660.69016: stdout chunk (state=3): >>>ansible-tmp-1726853660.6608229-30395-8977773275527=/root/.ansible/tmp/ansible-tmp-1726853660.6608229-30395-8977773275527 <<< 29922 1726853660.69187: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853660.69191: stdout chunk (state=3): >>><<< 29922 1726853660.69193: stderr chunk (state=3): >>><<< 29922 1726853660.69204: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853660.6608229-30395-8977773275527=/root/.ansible/tmp/ansible-tmp-1726853660.6608229-30395-8977773275527 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853660.69256: variable 'ansible_module_compression' from source: unknown 29922 1726853660.69293: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-29922avbm1zsd/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 29922 1726853660.69329: variable 'ansible_facts' from source: unknown 29922 1726853660.69394: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853660.6608229-30395-8977773275527/AnsiballZ_command.py 29922 1726853660.69588: Sending initial data 29922 1726853660.69591: Sent initial data (154 bytes) 29922 1726853660.70169: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 29922 1726853660.70189: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<< 29922 1726853660.70290: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853660.70309: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853660.70405: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853660.71996: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 29922 1726853660.72065: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 29922 1726853660.72158: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-29922avbm1zsd/tmppw4v_zwk /root/.ansible/tmp/ansible-tmp-1726853660.6608229-30395-8977773275527/AnsiballZ_command.py <<< 29922 1726853660.72168: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853660.6608229-30395-8977773275527/AnsiballZ_command.py" <<< 29922 1726853660.72221: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-29922avbm1zsd/tmppw4v_zwk" to remote "/root/.ansible/tmp/ansible-tmp-1726853660.6608229-30395-8977773275527/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853660.6608229-30395-8977773275527/AnsiballZ_command.py" <<< 29922 1726853660.72936: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853660.73066: stderr chunk (state=3): >>><<< 29922 1726853660.73069: stdout chunk (state=3): >>><<< 29922 1726853660.73077: done transferring module to remote 29922 1726853660.73084: _low_level_execute_command(): starting 29922 1726853660.73087: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853660.6608229-30395-8977773275527/ /root/.ansible/tmp/ansible-tmp-1726853660.6608229-30395-8977773275527/AnsiballZ_command.py && sleep 0' 29922 1726853660.74033: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853660.74055: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853660.74076: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853660.74096: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853660.74114: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 29922 1726853660.74127: stderr chunk (state=3): >>>debug2: match not found <<< 29922 1726853660.74190: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853660.74244: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853660.74266: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853660.74294: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853660.74386: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853660.76480: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853660.76488: stdout chunk (state=3): >>><<< 29922 1726853660.76491: stderr chunk (state=3): >>><<< 29922 1726853660.76497: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853660.76655: _low_level_execute_command(): starting 29922 1726853660.76661: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853660.6608229-30395-8977773275527/AnsiballZ_command.py && sleep 0' 29922 1726853660.77377: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853660.77381: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 29922 1726853660.77384: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853660.77391: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853660.77487: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853660.77558: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853660.94956: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "ethtest0", "managed", "true"], "start": "2024-09-20 13:34:20.928418", "end": "2024-09-20 13:34:20.947241", "delta": "0:00:00.018823", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set ethtest0 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 29922 1726853660.96678: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 29922 1726853660.96682: stdout chunk (state=3): >>><<< 29922 1726853660.96685: stderr chunk (state=3): >>><<< 29922 1726853660.96688: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "ethtest0", "managed", "true"], "start": "2024-09-20 13:34:20.928418", "end": "2024-09-20 13:34:20.947241", "delta": "0:00:00.018823", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set ethtest0 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
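The JSON above is the ansible.legacy.command module returning the result of "nmcli d set ethtest0 managed true" on the target. The task file behind it (manage_test_interface.yml) is not reproduced in this log, so the following is only a minimal sketch inferred from the module_args and from the ok / "changed": false status reported further down; in particular, changed_when: false is an assumption made to explain why a command that returned changed=true is displayed as unchanged.

# Hypothetical reconstruction; only the command line itself is taken from this log.
- name: Set up veth as managed by NetworkManager
  command: nmcli d set {{ interface }} managed true   # interface renders to ethtest0 via set_fact
  changed_when: false   # assumed, based on the "changed": false shown in the callback output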
29922 1726853660.96777: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli d set ethtest0 managed true', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853660.6608229-30395-8977773275527/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 29922 1726853660.96792: _low_level_execute_command(): starting 29922 1726853660.96802: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853660.6608229-30395-8977773275527/ > /dev/null 2>&1 && sleep 0' 29922 1726853660.97579: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853660.97596: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853660.97615: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853660.97639: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853660.97840: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853660.97999: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853660.98079: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853660.99947: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853660.99979: stderr chunk (state=3): >>><<< 29922 1726853660.99985: stdout chunk (state=3): >>><<< 29922 1726853661.00000: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853661.00006: handler run complete 29922 1726853661.00026: Evaluated conditional (False): False 29922 1726853661.00042: attempt loop complete, returning result 29922 1726853661.00045: _execute() done 29922 1726853661.00048: dumping result to json 29922 1726853661.00050: done dumping result, returning 29922 1726853661.00052: done running TaskExecutor() for managed_node3/TASK: Set up veth as managed by NetworkManager [02083763-bbaf-51d4-513b-0000000001d1] 29922 1726853661.00060: sending task result for task 02083763-bbaf-51d4-513b-0000000001d1 29922 1726853661.00162: done sending task result for task 02083763-bbaf-51d4-513b-0000000001d1 29922 1726853661.00164: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "nmcli", "d", "set", "ethtest0", "managed", "true" ], "delta": "0:00:00.018823", "end": "2024-09-20 13:34:20.947241", "rc": 0, "start": "2024-09-20 13:34:20.928418" } 29922 1726853661.00229: no more pending results, returning what we have 29922 1726853661.00233: results queue empty 29922 1726853661.00233: checking for any_errors_fatal 29922 1726853661.00248: done checking for any_errors_fatal 29922 1726853661.00248: checking for max_fail_percentage 29922 1726853661.00250: done checking for max_fail_percentage 29922 1726853661.00251: checking to see if all hosts have failed and the running result is not ok 29922 1726853661.00252: done checking to see if all hosts have failed 29922 1726853661.00252: getting the remaining hosts for this loop 29922 1726853661.00253: done getting the remaining hosts for this loop 29922 1726853661.00260: getting the next task for host managed_node3 29922 1726853661.00265: done getting next task for host managed_node3 29922 1726853661.00268: ^ task is: TASK: Delete veth interface {{ interface }} 29922 1726853661.00270: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853661.00276: getting variables 29922 1726853661.00277: in VariableManager get_vars() 29922 1726853661.00313: Calling all_inventory to load vars for managed_node3 29922 1726853661.00315: Calling groups_inventory to load vars for managed_node3 29922 1726853661.00317: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853661.00326: Calling all_plugins_play to load vars for managed_node3 29922 1726853661.00329: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853661.00331: Calling groups_plugins_play to load vars for managed_node3 29922 1726853661.00486: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853661.00645: done with get_vars() 29922 1726853661.00652: done getting variables 29922 1726853661.00702: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 29922 1726853661.00789: variable 'interface' from source: set_fact TASK [Delete veth interface ethtest0] ****************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:43 Friday 20 September 2024 13:34:21 -0400 (0:00:00.390) 0:00:09.937 ****** 29922 1726853661.00817: entering _queue_task() for managed_node3/command 29922 1726853661.01031: worker is 1 (out of 1 available) 29922 1726853661.01047: exiting _queue_task() for managed_node3/command 29922 1726853661.01062: done queuing things up, now waiting for results queue to drain 29922 1726853661.01063: waiting for pending results... 
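Every task in this run repeats the same remote cycle visible above: sftp the AnsiballZ payload into a fresh directory under ~/.ansible/tmp, chmod it, execute it with the remote Python, then rm -rf the directory. That is the non-pipelined code path (the connection vars later in this log show ansible_pipelining set to False). As a sketch, and assuming the defaults seen here, the extra round-trips could be removed or the temp location moved with standard inventory/group variables; the values below are illustrative, not taken from this run.

# Illustrative values; the variable names are standard connection/shell plugin options.
ansible_pipelining: true                       # stream the module over the open SSH session, skipping put/chmod/rm
ansible_remote_tmp: /var/tmp/.ansible/tmp      # relocate the per-task temporary directories

Pipelining only affects module execution (file transfers such as copy still use sftp/scp) and traditionally requires that sudoers on the target does not enforce requiretty.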
29922 1726853661.01213: running TaskExecutor() for managed_node3/TASK: Delete veth interface ethtest0 29922 1726853661.01280: in run() - task 02083763-bbaf-51d4-513b-0000000001d2 29922 1726853661.01293: variable 'ansible_search_path' from source: unknown 29922 1726853661.01300: variable 'ansible_search_path' from source: unknown 29922 1726853661.01331: calling self._execute() 29922 1726853661.01399: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853661.01408: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853661.01414: variable 'omit' from source: magic vars 29922 1726853661.01893: variable 'ansible_distribution_major_version' from source: facts 29922 1726853661.01896: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853661.01969: variable 'type' from source: set_fact 29922 1726853661.02077: variable 'state' from source: include params 29922 1726853661.02082: variable 'interface' from source: set_fact 29922 1726853661.02085: variable 'current_interfaces' from source: set_fact 29922 1726853661.02088: Evaluated conditional (type == 'veth' and state == 'absent' and interface in current_interfaces): False 29922 1726853661.02090: when evaluation is False, skipping this task 29922 1726853661.02092: _execute() done 29922 1726853661.02094: dumping result to json 29922 1726853661.02096: done dumping result, returning 29922 1726853661.02098: done running TaskExecutor() for managed_node3/TASK: Delete veth interface ethtest0 [02083763-bbaf-51d4-513b-0000000001d2] 29922 1726853661.02100: sending task result for task 02083763-bbaf-51d4-513b-0000000001d2 29922 1726853661.02164: done sending task result for task 02083763-bbaf-51d4-513b-0000000001d2 29922 1726853661.02167: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'veth' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 29922 1726853661.02220: no more pending results, returning what we have 29922 1726853661.02224: results queue empty 29922 1726853661.02225: checking for any_errors_fatal 29922 1726853661.02236: done checking for any_errors_fatal 29922 1726853661.02237: checking for max_fail_percentage 29922 1726853661.02238: done checking for max_fail_percentage 29922 1726853661.02239: checking to see if all hosts have failed and the running result is not ok 29922 1726853661.02240: done checking to see if all hosts have failed 29922 1726853661.02241: getting the remaining hosts for this loop 29922 1726853661.02243: done getting the remaining hosts for this loop 29922 1726853661.02247: getting the next task for host managed_node3 29922 1726853661.02252: done getting next task for host managed_node3 29922 1726853661.02256: ^ task is: TASK: Create dummy interface {{ interface }} 29922 1726853661.02262: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853661.02266: getting variables 29922 1726853661.02267: in VariableManager get_vars() 29922 1726853661.02307: Calling all_inventory to load vars for managed_node3 29922 1726853661.02309: Calling groups_inventory to load vars for managed_node3 29922 1726853661.02311: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853661.02324: Calling all_plugins_play to load vars for managed_node3 29922 1726853661.02327: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853661.02330: Calling groups_plugins_play to load vars for managed_node3 29922 1726853661.02747: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853661.02893: done with get_vars() 29922 1726853661.02900: done getting variables 29922 1726853661.02950: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 29922 1726853661.03035: variable 'interface' from source: set_fact TASK [Create dummy interface ethtest0] ***************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:49 Friday 20 September 2024 13:34:21 -0400 (0:00:00.022) 0:00:09.960 ****** 29922 1726853661.03062: entering _queue_task() for managed_node3/command 29922 1726853661.03279: worker is 1 (out of 1 available) 29922 1726853661.03294: exiting _queue_task() for managed_node3/command 29922 1726853661.03306: done queuing things up, now waiting for results queue to drain 29922 1726853661.03307: waiting for pending results... 
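The variable sources logged for these checks ('type', 'interface' and 'current_interfaces' from set_fact, 'state' from include params) show how manage_test_interface.yml is being driven. The calling playbook is not part of this excerpt, so the wiring below is only a plausible reconstruction, with the concrete values (ethtest0, veth, present) inferred from which tasks ran and which were skipped.

# Plausible reconstruction; only manage_test_interface.yml and the variable
# names/sources are taken from this log.
- name: Set facts for the test interface
  set_fact:
    interface: ethtest0
    type: veth

- name: Manage the test interface
  include_tasks: tasks/manage_test_interface.yml
  vars:
    state: present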
29922 1726853661.03457: running TaskExecutor() for managed_node3/TASK: Create dummy interface ethtest0 29922 1726853661.03534: in run() - task 02083763-bbaf-51d4-513b-0000000001d3 29922 1726853661.03543: variable 'ansible_search_path' from source: unknown 29922 1726853661.03547: variable 'ansible_search_path' from source: unknown 29922 1726853661.03581: calling self._execute() 29922 1726853661.03642: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853661.03648: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853661.03656: variable 'omit' from source: magic vars 29922 1726853661.03924: variable 'ansible_distribution_major_version' from source: facts 29922 1726853661.03933: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853661.04065: variable 'type' from source: set_fact 29922 1726853661.04068: variable 'state' from source: include params 29922 1726853661.04073: variable 'interface' from source: set_fact 29922 1726853661.04078: variable 'current_interfaces' from source: set_fact 29922 1726853661.04090: Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False 29922 1726853661.04093: when evaluation is False, skipping this task 29922 1726853661.04096: _execute() done 29922 1726853661.04098: dumping result to json 29922 1726853661.04101: done dumping result, returning 29922 1726853661.04105: done running TaskExecutor() for managed_node3/TASK: Create dummy interface ethtest0 [02083763-bbaf-51d4-513b-0000000001d3] 29922 1726853661.04110: sending task result for task 02083763-bbaf-51d4-513b-0000000001d3 29922 1726853661.04186: done sending task result for task 02083763-bbaf-51d4-513b-0000000001d3 29922 1726853661.04190: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'dummy' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 29922 1726853661.04231: no more pending results, returning what we have 29922 1726853661.04236: results queue empty 29922 1726853661.04236: checking for any_errors_fatal 29922 1726853661.04243: done checking for any_errors_fatal 29922 1726853661.04243: checking for max_fail_percentage 29922 1726853661.04245: done checking for max_fail_percentage 29922 1726853661.04246: checking to see if all hosts have failed and the running result is not ok 29922 1726853661.04247: done checking to see if all hosts have failed 29922 1726853661.04247: getting the remaining hosts for this loop 29922 1726853661.04248: done getting the remaining hosts for this loop 29922 1726853661.04251: getting the next task for host managed_node3 29922 1726853661.04256: done getting next task for host managed_node3 29922 1726853661.04259: ^ task is: TASK: Delete dummy interface {{ interface }} 29922 1726853661.04261: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853661.04266: getting variables 29922 1726853661.04267: in VariableManager get_vars() 29922 1726853661.04298: Calling all_inventory to load vars for managed_node3 29922 1726853661.04300: Calling groups_inventory to load vars for managed_node3 29922 1726853661.04302: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853661.04311: Calling all_plugins_play to load vars for managed_node3 29922 1726853661.04313: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853661.04316: Calling groups_plugins_play to load vars for managed_node3 29922 1726853661.04481: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853661.04620: done with get_vars() 29922 1726853661.04627: done getting variables 29922 1726853661.04669: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 29922 1726853661.04839: variable 'interface' from source: set_fact TASK [Delete dummy interface ethtest0] ***************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:54 Friday 20 September 2024 13:34:21 -0400 (0:00:00.018) 0:00:09.978 ****** 29922 1726853661.04882: entering _queue_task() for managed_node3/command 29922 1726853661.05154: worker is 1 (out of 1 available) 29922 1726853661.05170: exiting _queue_task() for managed_node3/command 29922 1726853661.05186: done queuing things up, now waiting for results queue to drain 29922 1726853661.05187: waiting for pending results... 
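Each create/delete task in manage_test_interface.yml carries a when: guard over type, state and current_interfaces, and the executor echoes that expression back as false_condition whenever it skips one, as in the surrounding results. A single guarded task plausibly looks like the sketch below; only the name template and the condition come from this log, and the ip command body is a placeholder.

# Hypothetical body; the real command is not shown in this output.
- name: Delete dummy interface {{ interface }}
  command: ip link delete {{ interface }} type dummy   # placeholder command
  when: type == 'dummy' and state == 'absent' and interface in current_interfaces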
29922 1726853661.05429: running TaskExecutor() for managed_node3/TASK: Delete dummy interface ethtest0 29922 1726853661.05578: in run() - task 02083763-bbaf-51d4-513b-0000000001d4 29922 1726853661.05583: variable 'ansible_search_path' from source: unknown 29922 1726853661.05587: variable 'ansible_search_path' from source: unknown 29922 1726853661.05590: calling self._execute() 29922 1726853661.05660: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853661.05674: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853661.05687: variable 'omit' from source: magic vars 29922 1726853661.06028: variable 'ansible_distribution_major_version' from source: facts 29922 1726853661.06046: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853661.06242: variable 'type' from source: set_fact 29922 1726853661.06261: variable 'state' from source: include params 29922 1726853661.06288: variable 'interface' from source: set_fact 29922 1726853661.06292: variable 'current_interfaces' from source: set_fact 29922 1726853661.06295: Evaluated conditional (type == 'dummy' and state == 'absent' and interface in current_interfaces): False 29922 1726853661.06297: when evaluation is False, skipping this task 29922 1726853661.06300: _execute() done 29922 1726853661.06302: dumping result to json 29922 1726853661.06304: done dumping result, returning 29922 1726853661.06306: done running TaskExecutor() for managed_node3/TASK: Delete dummy interface ethtest0 [02083763-bbaf-51d4-513b-0000000001d4] 29922 1726853661.06308: sending task result for task 02083763-bbaf-51d4-513b-0000000001d4 skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'dummy' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 29922 1726853661.06443: no more pending results, returning what we have 29922 1726853661.06446: results queue empty 29922 1726853661.06446: checking for any_errors_fatal 29922 1726853661.06477: done checking for any_errors_fatal 29922 1726853661.06479: checking for max_fail_percentage 29922 1726853661.06480: done checking for max_fail_percentage 29922 1726853661.06481: checking to see if all hosts have failed and the running result is not ok 29922 1726853661.06482: done checking to see if all hosts have failed 29922 1726853661.06483: getting the remaining hosts for this loop 29922 1726853661.06484: done getting the remaining hosts for this loop 29922 1726853661.06487: getting the next task for host managed_node3 29922 1726853661.06491: done getting next task for host managed_node3 29922 1726853661.06494: ^ task is: TASK: Create tap interface {{ interface }} 29922 1726853661.06496: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853661.06500: getting variables 29922 1726853661.06501: in VariableManager get_vars() 29922 1726853661.06528: Calling all_inventory to load vars for managed_node3 29922 1726853661.06530: Calling groups_inventory to load vars for managed_node3 29922 1726853661.06532: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853661.06545: Calling all_plugins_play to load vars for managed_node3 29922 1726853661.06549: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853661.06555: Calling groups_plugins_play to load vars for managed_node3 29922 1726853661.06734: done sending task result for task 02083763-bbaf-51d4-513b-0000000001d4 29922 1726853661.06738: WORKER PROCESS EXITING 29922 1726853661.06748: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853661.06889: done with get_vars() 29922 1726853661.06899: done getting variables 29922 1726853661.06941: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 29922 1726853661.07019: variable 'interface' from source: set_fact TASK [Create tap interface ethtest0] ******************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:60 Friday 20 September 2024 13:34:21 -0400 (0:00:00.021) 0:00:09.999 ****** 29922 1726853661.07042: entering _queue_task() for managed_node3/command 29922 1726853661.07229: worker is 1 (out of 1 available) 29922 1726853661.07241: exiting _queue_task() for managed_node3/command 29922 1726853661.07252: done queuing things up, now waiting for results queue to drain 29922 1726853661.07270: waiting for pending results... 
29922 1726853661.07433: running TaskExecutor() for managed_node3/TASK: Create tap interface ethtest0 29922 1726853661.07506: in run() - task 02083763-bbaf-51d4-513b-0000000001d5 29922 1726853661.07516: variable 'ansible_search_path' from source: unknown 29922 1726853661.07520: variable 'ansible_search_path' from source: unknown 29922 1726853661.07547: calling self._execute() 29922 1726853661.07643: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853661.07658: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853661.07668: variable 'omit' from source: magic vars 29922 1726853661.08273: variable 'ansible_distribution_major_version' from source: facts 29922 1726853661.08277: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853661.08280: variable 'type' from source: set_fact 29922 1726853661.08282: variable 'state' from source: include params 29922 1726853661.08284: variable 'interface' from source: set_fact 29922 1726853661.08285: variable 'current_interfaces' from source: set_fact 29922 1726853661.08288: Evaluated conditional (type == 'tap' and state == 'present' and interface not in current_interfaces): False 29922 1726853661.08289: when evaluation is False, skipping this task 29922 1726853661.08291: _execute() done 29922 1726853661.08293: dumping result to json 29922 1726853661.08294: done dumping result, returning 29922 1726853661.08296: done running TaskExecutor() for managed_node3/TASK: Create tap interface ethtest0 [02083763-bbaf-51d4-513b-0000000001d5] 29922 1726853661.08298: sending task result for task 02083763-bbaf-51d4-513b-0000000001d5 29922 1726853661.08353: done sending task result for task 02083763-bbaf-51d4-513b-0000000001d5 29922 1726853661.08355: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'tap' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 29922 1726853661.08403: no more pending results, returning what we have 29922 1726853661.08406: results queue empty 29922 1726853661.08406: checking for any_errors_fatal 29922 1726853661.08411: done checking for any_errors_fatal 29922 1726853661.08412: checking for max_fail_percentage 29922 1726853661.08413: done checking for max_fail_percentage 29922 1726853661.08414: checking to see if all hosts have failed and the running result is not ok 29922 1726853661.08415: done checking to see if all hosts have failed 29922 1726853661.08416: getting the remaining hosts for this loop 29922 1726853661.08417: done getting the remaining hosts for this loop 29922 1726853661.08420: getting the next task for host managed_node3 29922 1726853661.08424: done getting next task for host managed_node3 29922 1726853661.08426: ^ task is: TASK: Delete tap interface {{ interface }} 29922 1726853661.08429: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853661.08432: getting variables 29922 1726853661.08434: in VariableManager get_vars() 29922 1726853661.08464: Calling all_inventory to load vars for managed_node3 29922 1726853661.08476: Calling groups_inventory to load vars for managed_node3 29922 1726853661.08479: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853661.08488: Calling all_plugins_play to load vars for managed_node3 29922 1726853661.08491: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853661.08493: Calling groups_plugins_play to load vars for managed_node3 29922 1726853661.08750: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853661.08992: done with get_vars() 29922 1726853661.09004: done getting variables 29922 1726853661.09076: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 29922 1726853661.09203: variable 'interface' from source: set_fact TASK [Delete tap interface ethtest0] ******************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:65 Friday 20 September 2024 13:34:21 -0400 (0:00:00.022) 0:00:10.022 ****** 29922 1726853661.09247: entering _queue_task() for managed_node3/command 29922 1726853661.09490: worker is 1 (out of 1 available) 29922 1726853661.09528: exiting _queue_task() for managed_node3/command 29922 1726853661.09540: done queuing things up, now waiting for results queue to drain 29922 1726853661.09541: waiting for pending results... 
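Before any of these type/state guards is checked, each task first evaluates ansible_distribution_major_version != '6' (True on this host), which is how the test excludes EL6-era targets. Whether that guard sits on the individual tasks, on an enclosing block, or on the include is not visible in this excerpt; written as a task-level guard it would look roughly like this, again with a placeholder command.

# Placement is illustrative only; the log shows the conditional being evaluated,
# not where it is defined.
- name: Delete tap interface {{ interface }}
  command: ip tuntap del dev {{ interface }} mode tap   # placeholder command
  when:
    - ansible_distribution_major_version != '6'
    - type == 'tap' and state == 'absent' and interface in current_interfaces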
29922 1726853661.09784: running TaskExecutor() for managed_node3/TASK: Delete tap interface ethtest0 29922 1726853661.09887: in run() - task 02083763-bbaf-51d4-513b-0000000001d6 29922 1726853661.09891: variable 'ansible_search_path' from source: unknown 29922 1726853661.09894: variable 'ansible_search_path' from source: unknown 29922 1726853661.09924: calling self._execute() 29922 1726853661.10015: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853661.10027: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853661.10059: variable 'omit' from source: magic vars 29922 1726853661.10441: variable 'ansible_distribution_major_version' from source: facts 29922 1726853661.10458: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853661.10674: variable 'type' from source: set_fact 29922 1726853661.10693: variable 'state' from source: include params 29922 1726853661.10724: variable 'interface' from source: set_fact 29922 1726853661.10734: variable 'current_interfaces' from source: set_fact 29922 1726853661.10747: Evaluated conditional (type == 'tap' and state == 'absent' and interface in current_interfaces): False 29922 1726853661.10763: when evaluation is False, skipping this task 29922 1726853661.10965: _execute() done 29922 1726853661.10968: dumping result to json 29922 1726853661.10970: done dumping result, returning 29922 1726853661.10974: done running TaskExecutor() for managed_node3/TASK: Delete tap interface ethtest0 [02083763-bbaf-51d4-513b-0000000001d6] 29922 1726853661.10976: sending task result for task 02083763-bbaf-51d4-513b-0000000001d6 29922 1726853661.11028: done sending task result for task 02083763-bbaf-51d4-513b-0000000001d6 29922 1726853661.11031: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "type == 'tap' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 29922 1726853661.11076: no more pending results, returning what we have 29922 1726853661.11078: results queue empty 29922 1726853661.11079: checking for any_errors_fatal 29922 1726853661.11084: done checking for any_errors_fatal 29922 1726853661.11085: checking for max_fail_percentage 29922 1726853661.11086: done checking for max_fail_percentage 29922 1726853661.11087: checking to see if all hosts have failed and the running result is not ok 29922 1726853661.11088: done checking to see if all hosts have failed 29922 1726853661.11089: getting the remaining hosts for this loop 29922 1726853661.11090: done getting the remaining hosts for this loop 29922 1726853661.11093: getting the next task for host managed_node3 29922 1726853661.11100: done getting next task for host managed_node3 29922 1726853661.11103: ^ task is: TASK: Include the task 'assert_device_present.yml' 29922 1726853661.11106: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853661.11114: getting variables 29922 1726853661.11115: in VariableManager get_vars() 29922 1726853661.11152: Calling all_inventory to load vars for managed_node3 29922 1726853661.11155: Calling groups_inventory to load vars for managed_node3 29922 1726853661.11157: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853661.11167: Calling all_plugins_play to load vars for managed_node3 29922 1726853661.11169: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853661.11173: Calling groups_plugins_play to load vars for managed_node3 29922 1726853661.11310: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853661.11436: done with get_vars() 29922 1726853661.11442: done getting variables TASK [Include the task 'assert_device_present.yml'] **************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:20 Friday 20 September 2024 13:34:21 -0400 (0:00:00.022) 0:00:10.044 ****** 29922 1726853661.11507: entering _queue_task() for managed_node3/include_tasks 29922 1726853661.11697: worker is 1 (out of 1 available) 29922 1726853661.11713: exiting _queue_task() for managed_node3/include_tasks 29922 1726853661.11725: done queuing things up, now waiting for results queue to drain 29922 1726853661.11727: waiting for pending results... 29922 1726853661.11878: running TaskExecutor() for managed_node3/TASK: Include the task 'assert_device_present.yml' 29922 1726853661.11970: in run() - task 02083763-bbaf-51d4-513b-00000000000e 29922 1726853661.12002: variable 'ansible_search_path' from source: unknown 29922 1726853661.12021: calling self._execute() 29922 1726853661.12122: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853661.12128: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853661.12136: variable 'omit' from source: magic vars 29922 1726853661.12468: variable 'ansible_distribution_major_version' from source: facts 29922 1726853661.12482: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853661.12490: _execute() done 29922 1726853661.12493: dumping result to json 29922 1726853661.12496: done dumping result, returning 29922 1726853661.12510: done running TaskExecutor() for managed_node3/TASK: Include the task 'assert_device_present.yml' [02083763-bbaf-51d4-513b-00000000000e] 29922 1726853661.12517: sending task result for task 02083763-bbaf-51d4-513b-00000000000e 29922 1726853661.12592: done sending task result for task 02083763-bbaf-51d4-513b-00000000000e 29922 1726853661.12595: WORKER PROCESS EXITING 29922 1726853661.12649: no more pending results, returning what we have 29922 1726853661.12653: in VariableManager get_vars() 29922 1726853661.12688: Calling all_inventory to load vars for managed_node3 29922 1726853661.12690: Calling groups_inventory to load vars for managed_node3 29922 1726853661.12692: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853661.12701: Calling all_plugins_play to load vars for managed_node3 29922 1726853661.12703: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853661.12706: Calling groups_plugins_play to load vars for managed_node3 29922 1726853661.12925: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853661.13113: done with get_vars() 29922 
1726853661.13121: variable 'ansible_search_path' from source: unknown 29922 1726853661.13133: we have included files to process 29922 1726853661.13134: generating all_blocks data 29922 1726853661.13135: done generating all_blocks data 29922 1726853661.13139: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 29922 1726853661.13140: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 29922 1726853661.13142: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 29922 1726853661.13290: in VariableManager get_vars() 29922 1726853661.13308: done with get_vars() 29922 1726853661.13407: done processing included file 29922 1726853661.13409: iterating over new_blocks loaded from include file 29922 1726853661.13411: in VariableManager get_vars() 29922 1726853661.13425: done with get_vars() 29922 1726853661.13427: filtering new block on tags 29922 1726853661.13444: done filtering new block on tags 29922 1726853661.13447: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed_node3 29922 1726853661.13451: extending task lists for all hosts with included blocks 29922 1726853661.14882: done extending task lists 29922 1726853661.14883: done processing included files 29922 1726853661.14884: results queue empty 29922 1726853661.14884: checking for any_errors_fatal 29922 1726853661.14886: done checking for any_errors_fatal 29922 1726853661.14887: checking for max_fail_percentage 29922 1726853661.14888: done checking for max_fail_percentage 29922 1726853661.14888: checking to see if all hosts have failed and the running result is not ok 29922 1726853661.14888: done checking to see if all hosts have failed 29922 1726853661.14889: getting the remaining hosts for this loop 29922 1726853661.14890: done getting the remaining hosts for this loop 29922 1726853661.14891: getting the next task for host managed_node3 29922 1726853661.14894: done getting next task for host managed_node3 29922 1726853661.14895: ^ task is: TASK: Include the task 'get_interface_stat.yml' 29922 1726853661.14896: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853661.14898: getting variables 29922 1726853661.14898: in VariableManager get_vars() 29922 1726853661.14906: Calling all_inventory to load vars for managed_node3 29922 1726853661.14907: Calling groups_inventory to load vars for managed_node3 29922 1726853661.14911: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853661.14918: Calling all_plugins_play to load vars for managed_node3 29922 1726853661.14921: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853661.14924: Calling groups_plugins_play to load vars for managed_node3 29922 1726853661.15028: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853661.15176: done with get_vars() 29922 1726853661.15183: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 13:34:21 -0400 (0:00:00.037) 0:00:10.082 ****** 29922 1726853661.15247: entering _queue_task() for managed_node3/include_tasks 29922 1726853661.15460: worker is 1 (out of 1 available) 29922 1726853661.15480: exiting _queue_task() for managed_node3/include_tasks 29922 1726853661.15493: done queuing things up, now waiting for results queue to drain 29922 1726853661.15495: waiting for pending results... 29922 1726853661.15659: running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' 29922 1726853661.15718: in run() - task 02083763-bbaf-51d4-513b-0000000002ec 29922 1726853661.15728: variable 'ansible_search_path' from source: unknown 29922 1726853661.15737: variable 'ansible_search_path' from source: unknown 29922 1726853661.15779: calling self._execute() 29922 1726853661.15843: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853661.15848: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853661.15860: variable 'omit' from source: magic vars 29922 1726853661.16151: variable 'ansible_distribution_major_version' from source: facts 29922 1726853661.16172: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853661.16177: _execute() done 29922 1726853661.16181: dumping result to json 29922 1726853661.16184: done dumping result, returning 29922 1726853661.16195: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' [02083763-bbaf-51d4-513b-0000000002ec] 29922 1726853661.16198: sending task result for task 02083763-bbaf-51d4-513b-0000000002ec 29922 1726853661.16292: done sending task result for task 02083763-bbaf-51d4-513b-0000000002ec 29922 1726853661.16296: WORKER PROCESS EXITING 29922 1726853661.16330: no more pending results, returning what we have 29922 1726853661.16335: in VariableManager get_vars() 29922 1726853661.16375: Calling all_inventory to load vars for managed_node3 29922 1726853661.16378: Calling groups_inventory to load vars for managed_node3 29922 1726853661.16384: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853661.16396: Calling all_plugins_play to load vars for managed_node3 29922 1726853661.16399: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853661.16401: Calling groups_plugins_play to load vars for managed_node3 29922 1726853661.16570: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 29922 1726853661.16733: done with get_vars() 29922 1726853661.16738: variable 'ansible_search_path' from source: unknown 29922 1726853661.16739: variable 'ansible_search_path' from source: unknown 29922 1726853661.16763: we have included files to process 29922 1726853661.16764: generating all_blocks data 29922 1726853661.16765: done generating all_blocks data 29922 1726853661.16767: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 29922 1726853661.16767: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 29922 1726853661.16769: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 29922 1726853661.16934: done processing included file 29922 1726853661.16936: iterating over new_blocks loaded from include file 29922 1726853661.16937: in VariableManager get_vars() 29922 1726853661.16946: done with get_vars() 29922 1726853661.16947: filtering new block on tags 29922 1726853661.16970: done filtering new block on tags 29922 1726853661.16976: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node3 29922 1726853661.16985: extending task lists for all hosts with included blocks 29922 1726853661.17052: done extending task lists 29922 1726853661.17053: done processing included files 29922 1726853661.17053: results queue empty 29922 1726853661.17054: checking for any_errors_fatal 29922 1726853661.17056: done checking for any_errors_fatal 29922 1726853661.17056: checking for max_fail_percentage 29922 1726853661.17059: done checking for max_fail_percentage 29922 1726853661.17059: checking to see if all hosts have failed and the running result is not ok 29922 1726853661.17060: done checking to see if all hosts have failed 29922 1726853661.17060: getting the remaining hosts for this loop 29922 1726853661.17061: done getting the remaining hosts for this loop 29922 1726853661.17063: getting the next task for host managed_node3 29922 1726853661.17065: done getting next task for host managed_node3 29922 1726853661.17067: ^ task is: TASK: Get stat for interface {{ interface }} 29922 1726853661.17068: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853661.17070: getting variables 29922 1726853661.17072: in VariableManager get_vars() 29922 1726853661.17079: Calling all_inventory to load vars for managed_node3 29922 1726853661.17081: Calling groups_inventory to load vars for managed_node3 29922 1726853661.17082: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853661.17085: Calling all_plugins_play to load vars for managed_node3 29922 1726853661.17086: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853661.17088: Calling groups_plugins_play to load vars for managed_node3 29922 1726853661.17212: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853661.17330: done with get_vars() 29922 1726853661.17336: done getting variables 29922 1726853661.17442: variable 'interface' from source: set_fact TASK [Get stat for interface ethtest0] ***************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 13:34:21 -0400 (0:00:00.022) 0:00:10.104 ****** 29922 1726853661.17466: entering _queue_task() for managed_node3/stat 29922 1726853661.17648: worker is 1 (out of 1 available) 29922 1726853661.17663: exiting _queue_task() for managed_node3/stat 29922 1726853661.17677: done queuing things up, now waiting for results queue to drain 29922 1726853661.17678: waiting for pending results... 29922 1726853661.17828: running TaskExecutor() for managed_node3/TASK: Get stat for interface ethtest0 29922 1726853661.17901: in run() - task 02083763-bbaf-51d4-513b-0000000003b5 29922 1726853661.17918: variable 'ansible_search_path' from source: unknown 29922 1726853661.17922: variable 'ansible_search_path' from source: unknown 29922 1726853661.17944: calling self._execute() 29922 1726853661.18006: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853661.18010: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853661.18026: variable 'omit' from source: magic vars 29922 1726853661.18279: variable 'ansible_distribution_major_version' from source: facts 29922 1726853661.18292: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853661.18298: variable 'omit' from source: magic vars 29922 1726853661.18324: variable 'omit' from source: magic vars 29922 1726853661.18394: variable 'interface' from source: set_fact 29922 1726853661.18408: variable 'omit' from source: magic vars 29922 1726853661.18438: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29922 1726853661.18467: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29922 1726853661.18484: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29922 1726853661.18502: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853661.18576: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853661.18579: variable 'inventory_hostname' from source: host vars for 'managed_node3' 29922 1726853661.18582: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853661.18584: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 29922 1726853661.18644: Set connection var ansible_connection to ssh 29922 1726853661.18651: Set connection var ansible_module_compression to ZIP_DEFLATED 29922 1726853661.18660: Set connection var ansible_shell_executable to /bin/sh 29922 1726853661.18665: Set connection var ansible_pipelining to False 29922 1726853661.18670: Set connection var ansible_timeout to 10 29922 1726853661.18676: Set connection var ansible_shell_type to sh 29922 1726853661.18784: variable 'ansible_shell_executable' from source: unknown 29922 1726853661.18788: variable 'ansible_connection' from source: unknown 29922 1726853661.18790: variable 'ansible_module_compression' from source: unknown 29922 1726853661.18792: variable 'ansible_shell_type' from source: unknown 29922 1726853661.18794: variable 'ansible_shell_executable' from source: unknown 29922 1726853661.18796: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853661.18798: variable 'ansible_pipelining' from source: unknown 29922 1726853661.18802: variable 'ansible_timeout' from source: unknown 29922 1726853661.18804: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853661.18884: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 29922 1726853661.18892: variable 'omit' from source: magic vars 29922 1726853661.18897: starting attempt loop 29922 1726853661.18905: running the handler 29922 1726853661.18920: _low_level_execute_command(): starting 29922 1726853661.18923: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 29922 1726853661.19458: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853661.19462: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853661.19465: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853661.19467: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853661.19520: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853661.19524: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853661.19530: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853661.19595: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853661.21300: stdout chunk (state=3): >>>/root <<< 29922 1726853661.21409: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853661.21437: stderr chunk 
(state=3): >>><<< 29922 1726853661.21458: stdout chunk (state=3): >>><<< 29922 1726853661.21497: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853661.21521: _low_level_execute_command(): starting 29922 1726853661.21525: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853661.2150564-30437-275762883095087 `" && echo ansible-tmp-1726853661.2150564-30437-275762883095087="` echo /root/.ansible/tmp/ansible-tmp-1726853661.2150564-30437-275762883095087 `" ) && sleep 0' 29922 1726853661.22061: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration <<< 29922 1726853661.22076: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853661.22120: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853661.22125: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853661.22195: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853661.24313: stdout chunk (state=3): >>>ansible-tmp-1726853661.2150564-30437-275762883095087=/root/.ansible/tmp/ansible-tmp-1726853661.2150564-30437-275762883095087 <<< 29922 1726853661.24319: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853661.24322: stdout chunk (state=3): >>><<< 29922 1726853661.24324: stderr chunk 
(state=3): >>><<< 29922 1726853661.24326: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853661.2150564-30437-275762883095087=/root/.ansible/tmp/ansible-tmp-1726853661.2150564-30437-275762883095087 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853661.24421: variable 'ansible_module_compression' from source: unknown 29922 1726853661.24531: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-29922avbm1zsd/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 29922 1726853661.24534: variable 'ansible_facts' from source: unknown 29922 1726853661.24696: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853661.2150564-30437-275762883095087/AnsiballZ_stat.py 29922 1726853661.24795: Sending initial data 29922 1726853661.24799: Sent initial data (153 bytes) 29922 1726853661.25235: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853661.25239: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 29922 1726853661.25241: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 29922 1726853661.25243: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853661.25245: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853661.25306: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853661.25317: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853661.25319: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853661.25372: stderr chunk (state=3): >>>debug1: mux_client_request_session: master 
session id: 2 <<< 29922 1726853661.26959: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 29922 1726853661.26963: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 29922 1726853661.27014: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 29922 1726853661.27073: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-29922avbm1zsd/tmpxut6atzf /root/.ansible/tmp/ansible-tmp-1726853661.2150564-30437-275762883095087/AnsiballZ_stat.py <<< 29922 1726853661.27078: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853661.2150564-30437-275762883095087/AnsiballZ_stat.py" <<< 29922 1726853661.27132: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-29922avbm1zsd/tmpxut6atzf" to remote "/root/.ansible/tmp/ansible-tmp-1726853661.2150564-30437-275762883095087/AnsiballZ_stat.py" <<< 29922 1726853661.27135: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853661.2150564-30437-275762883095087/AnsiballZ_stat.py" <<< 29922 1726853661.27892: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853661.27895: stdout chunk (state=3): >>><<< 29922 1726853661.27897: stderr chunk (state=3): >>><<< 29922 1726853661.27899: done transferring module to remote 29922 1726853661.27909: _low_level_execute_command(): starting 29922 1726853661.27928: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853661.2150564-30437-275762883095087/ /root/.ansible/tmp/ansible-tmp-1726853661.2150564-30437-275762883095087/AnsiballZ_stat.py && sleep 0' 29922 1726853661.28720: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853661.28734: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853661.28788: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853661.28866: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853661.28900: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853661.28993: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853661.30895: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853661.30919: stdout chunk (state=3): >>><<< 29922 1726853661.30922: stderr chunk (state=3): >>><<< 29922 1726853661.31011: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853661.31015: _low_level_execute_command(): starting 29922 1726853661.31017: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853661.2150564-30437-275762883095087/AnsiballZ_stat.py && sleep 0' 29922 1726853661.31552: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853661.31580: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853661.31584: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853661.31586: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 29922 1726853661.31609: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853661.31612: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853661.31660: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853661.31664: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK 
debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853661.31739: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853661.47246: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/ethtest0", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 31868, "dev": 23, "nlink": 1, "atime": 1726853659.8313646, "mtime": 1726853659.8313646, "ctime": 1726853659.8313646, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/ethtest0", "lnk_target": "../../devices/virtual/net/ethtest0", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 29922 1726853661.48633: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 29922 1726853661.48660: stderr chunk (state=3): >>><<< 29922 1726853661.48663: stdout chunk (state=3): >>><<< 29922 1726853661.48685: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/ethtest0", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 31868, "dev": 23, "nlink": 1, "atime": 1726853659.8313646, "mtime": 1726853659.8313646, "ctime": 1726853659.8313646, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/ethtest0", "lnk_target": "../../devices/virtual/net/ethtest0", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 29922 1726853661.48726: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/ethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853661.2150564-30437-275762883095087/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 29922 1726853661.48733: _low_level_execute_command(): starting 29922 1726853661.48738: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853661.2150564-30437-275762883095087/ > /dev/null 2>&1 && sleep 0' 29922 1726853661.49250: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853661.49253: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 29922 1726853661.49256: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853661.49258: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853661.49260: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853661.49326: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853661.49329: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853661.49399: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853661.51261: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853661.51287: stderr chunk (state=3): >>><<< 29922 1726853661.51290: stdout chunk (state=3): >>><<< 29922 1726853661.51304: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853661.51310: handler run complete 29922 1726853661.51345: attempt loop complete, returning result 29922 1726853661.51349: _execute() done 29922 1726853661.51351: dumping result to json 29922 1726853661.51358: done dumping result, returning 29922 1726853661.51368: done running TaskExecutor() for managed_node3/TASK: Get stat for interface ethtest0 [02083763-bbaf-51d4-513b-0000000003b5] 29922 1726853661.51372: sending task result for task 02083763-bbaf-51d4-513b-0000000003b5 29922 1726853661.51475: done sending task result for task 02083763-bbaf-51d4-513b-0000000003b5 29922 1726853661.51478: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "atime": 1726853659.8313646, "block_size": 4096, "blocks": 0, "ctime": 1726853659.8313646, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 31868, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/ethtest0", "lnk_target": "../../devices/virtual/net/ethtest0", "mode": "0777", "mtime": 1726853659.8313646, "nlink": 1, "path": "/sys/class/net/ethtest0", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 29922 1726853661.51563: no more pending results, returning what we have 29922 1726853661.51567: results queue empty 29922 1726853661.51568: checking for any_errors_fatal 29922 1726853661.51569: done checking for any_errors_fatal 29922 1726853661.51570: checking for max_fail_percentage 29922 1726853661.51573: done checking for max_fail_percentage 29922 1726853661.51574: checking to see if all hosts have failed and the running result is not ok 29922 1726853661.51575: done checking to see if all hosts have failed 29922 1726853661.51575: getting the remaining hosts for this loop 29922 1726853661.51577: done getting the remaining hosts for this loop 29922 1726853661.51580: getting the next task for host managed_node3 29922 1726853661.51588: done getting next task for host managed_node3 29922 1726853661.51590: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 29922 1726853661.51593: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 29922 1726853661.51597: getting variables 29922 1726853661.51598: in VariableManager get_vars() 29922 1726853661.51629: Calling all_inventory to load vars for managed_node3 29922 1726853661.51631: Calling groups_inventory to load vars for managed_node3 29922 1726853661.51633: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853661.51642: Calling all_plugins_play to load vars for managed_node3 29922 1726853661.51644: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853661.51646: Calling groups_plugins_play to load vars for managed_node3 29922 1726853661.51796: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853661.51947: done with get_vars() 29922 1726853661.51954: done getting variables 29922 1726853661.52029: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) 29922 1726853661.52117: variable 'interface' from source: set_fact TASK [Assert that the interface is present - 'ethtest0'] *********************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 13:34:21 -0400 (0:00:00.346) 0:00:10.450 ****** 29922 1726853661.52139: entering _queue_task() for managed_node3/assert 29922 1726853661.52140: Creating lock for assert 29922 1726853661.52353: worker is 1 (out of 1 available) 29922 1726853661.52367: exiting _queue_task() for managed_node3/assert 29922 1726853661.52383: done queuing things up, now waiting for results queue to drain 29922 1726853661.52385: waiting for pending results... 
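The exchange above is the "Get stat for interface ethtest0" task end to end: Ansible reuses the multiplexed SSH master, creates a remote temp directory, uploads AnsiballZ_stat.py over SFTP, runs it with /usr/bin/python3.12, records the JSON result, and removes the temp directory. A minimal sketch of what the two tasks in assert_device_present.yml plausibly look like, reconstructed from the logged module_args, the 'interface' set_fact variable, and the interface_stat.stat.exists conditional evaluated just below (the register name and the {{ interface }} templating are inferred, not read from the playbook source):

    - name: Get stat for interface {{ interface }}
      stat:
        path: "/sys/class/net/{{ interface }}"
        get_attributes: false
        get_checksum: false
        get_mime: false
      register: interface_stat

    - name: "Assert that the interface is present - '{{ interface }}'"
      assert:
        that:
          - interface_stat.stat.exists

Because the kernel exposes every network device as a symlink under /sys/class/net/ (the stat result above reports islnk true with lnk_target ../../devices/virtual/net/ethtest0), interface_stat.stat.exists is a reliable presence check without shelling out to ip link.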
29922 1726853661.52546: running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'ethtest0' 29922 1726853661.52614: in run() - task 02083763-bbaf-51d4-513b-0000000002ed 29922 1726853661.52627: variable 'ansible_search_path' from source: unknown 29922 1726853661.52630: variable 'ansible_search_path' from source: unknown 29922 1726853661.52655: calling self._execute() 29922 1726853661.52720: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853661.52726: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853661.52737: variable 'omit' from source: magic vars 29922 1726853661.53002: variable 'ansible_distribution_major_version' from source: facts 29922 1726853661.53013: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853661.53019: variable 'omit' from source: magic vars 29922 1726853661.53044: variable 'omit' from source: magic vars 29922 1726853661.53115: variable 'interface' from source: set_fact 29922 1726853661.53128: variable 'omit' from source: magic vars 29922 1726853661.53161: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29922 1726853661.53193: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29922 1726853661.53208: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29922 1726853661.53221: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853661.53230: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853661.53253: variable 'inventory_hostname' from source: host vars for 'managed_node3' 29922 1726853661.53256: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853661.53264: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853661.53329: Set connection var ansible_connection to ssh 29922 1726853661.53335: Set connection var ansible_module_compression to ZIP_DEFLATED 29922 1726853661.53346: Set connection var ansible_shell_executable to /bin/sh 29922 1726853661.53362: Set connection var ansible_pipelining to False 29922 1726853661.53367: Set connection var ansible_timeout to 10 29922 1726853661.53369: Set connection var ansible_shell_type to sh 29922 1726853661.53475: variable 'ansible_shell_executable' from source: unknown 29922 1726853661.53479: variable 'ansible_connection' from source: unknown 29922 1726853661.53482: variable 'ansible_module_compression' from source: unknown 29922 1726853661.53484: variable 'ansible_shell_type' from source: unknown 29922 1726853661.53485: variable 'ansible_shell_executable' from source: unknown 29922 1726853661.53487: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853661.53489: variable 'ansible_pipelining' from source: unknown 29922 1726853661.53491: variable 'ansible_timeout' from source: unknown 29922 1726853661.53494: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853661.53584: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, 
class_only=False) 29922 1726853661.53776: variable 'omit' from source: magic vars 29922 1726853661.53779: starting attempt loop 29922 1726853661.53782: running the handler 29922 1726853661.53783: variable 'interface_stat' from source: set_fact 29922 1726853661.53785: Evaluated conditional (interface_stat.stat.exists): True 29922 1726853661.53787: handler run complete 29922 1726853661.53825: attempt loop complete, returning result 29922 1726853661.53832: _execute() done 29922 1726853661.53840: dumping result to json 29922 1726853661.53847: done dumping result, returning 29922 1726853661.53857: done running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'ethtest0' [02083763-bbaf-51d4-513b-0000000002ed] 29922 1726853661.53866: sending task result for task 02083763-bbaf-51d4-513b-0000000002ed ok: [managed_node3] => { "changed": false } MSG: All assertions passed 29922 1726853661.54024: no more pending results, returning what we have 29922 1726853661.54029: results queue empty 29922 1726853661.54030: checking for any_errors_fatal 29922 1726853661.54041: done checking for any_errors_fatal 29922 1726853661.54042: checking for max_fail_percentage 29922 1726853661.54043: done checking for max_fail_percentage 29922 1726853661.54044: checking to see if all hosts have failed and the running result is not ok 29922 1726853661.54045: done checking to see if all hosts have failed 29922 1726853661.54046: getting the remaining hosts for this loop 29922 1726853661.54047: done getting the remaining hosts for this loop 29922 1726853661.54051: getting the next task for host managed_node3 29922 1726853661.54058: done getting next task for host managed_node3 29922 1726853661.54061: ^ task is: TASK: Create a dedicated test file in `/etc/iproute2/rt_tables.d/` and add a new routing table 29922 1726853661.54063: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853661.54067: getting variables 29922 1726853661.54068: in VariableManager get_vars() 29922 1726853661.54107: Calling all_inventory to load vars for managed_node3 29922 1726853661.54110: Calling groups_inventory to load vars for managed_node3 29922 1726853661.54113: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853661.54127: Calling all_plugins_play to load vars for managed_node3 29922 1726853661.54131: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853661.54134: Calling groups_plugins_play to load vars for managed_node3 29922 1726853661.54390: done sending task result for task 02083763-bbaf-51d4-513b-0000000002ed 29922 1726853661.54393: WORKER PROCESS EXITING 29922 1726853661.54425: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853661.54601: done with get_vars() 29922 1726853661.54609: done getting variables TASK [Create a dedicated test file in `/etc/iproute2/rt_tables.d/` and add a new routing table] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:23 Friday 20 September 2024 13:34:21 -0400 (0:00:00.025) 0:00:10.476 ****** 29922 1726853661.54678: entering _queue_task() for managed_node3/lineinfile 29922 1726853661.54680: Creating lock for lineinfile 29922 1726853661.54877: worker is 1 (out of 1 available) 29922 1726853661.54889: exiting _queue_task() for managed_node3/lineinfile 29922 1726853661.54901: done queuing things up, now waiting for results queue to drain 29922 1726853661.54903: waiting for pending results... 29922 1726853661.55070: running TaskExecutor() for managed_node3/TASK: Create a dedicated test file in `/etc/iproute2/rt_tables.d/` and add a new routing table 29922 1726853661.55133: in run() - task 02083763-bbaf-51d4-513b-00000000000f 29922 1726853661.55140: variable 'ansible_search_path' from source: unknown 29922 1726853661.55176: calling self._execute() 29922 1726853661.55244: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853661.55250: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853661.55262: variable 'omit' from source: magic vars 29922 1726853661.55518: variable 'ansible_distribution_major_version' from source: facts 29922 1726853661.55527: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853661.55533: variable 'omit' from source: magic vars 29922 1726853661.55546: variable 'omit' from source: magic vars 29922 1726853661.55576: variable 'omit' from source: magic vars 29922 1726853661.55608: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29922 1726853661.55633: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29922 1726853661.55648: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29922 1726853661.55663: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853661.55677: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853661.55700: variable 'inventory_hostname' from source: host vars for 'managed_node3' 29922 1726853661.55703: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853661.55705: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853661.55767: Set connection var ansible_connection to ssh 29922 1726853661.55775: Set connection var ansible_module_compression to ZIP_DEFLATED 29922 1726853661.55785: Set connection var ansible_shell_executable to /bin/sh 29922 1726853661.55790: Set connection var ansible_pipelining to False 29922 1726853661.55795: Set connection var ansible_timeout to 10 29922 1726853661.55799: Set connection var ansible_shell_type to sh 29922 1726853661.55818: variable 'ansible_shell_executable' from source: unknown 29922 1726853661.55821: variable 'ansible_connection' from source: unknown 29922 1726853661.55823: variable 'ansible_module_compression' from source: unknown 29922 1726853661.55826: variable 'ansible_shell_type' from source: unknown 29922 1726853661.55829: variable 'ansible_shell_executable' from source: unknown 29922 1726853661.55831: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853661.55833: variable 'ansible_pipelining' from source: unknown 29922 1726853661.55836: variable 'ansible_timeout' from source: unknown 29922 1726853661.55839: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853661.55982: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 29922 1726853661.55990: variable 'omit' from source: magic vars 29922 1726853661.56001: starting attempt loop 29922 1726853661.56004: running the handler 29922 1726853661.56011: _low_level_execute_command(): starting 29922 1726853661.56020: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 29922 1726853661.56522: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853661.56526: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 29922 1726853661.56529: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 29922 1726853661.56532: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853661.56576: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853661.56582: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853661.56656: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853661.58354: stdout chunk (state=3): >>>/root <<< 29922 1726853661.58476: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853661.58509: stderr 
chunk (state=3): >>><<< 29922 1726853661.58512: stdout chunk (state=3): >>><<< 29922 1726853661.58567: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853661.58574: _low_level_execute_command(): starting 29922 1726853661.58577: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853661.5854666-30454-132438859919859 `" && echo ansible-tmp-1726853661.5854666-30454-132438859919859="` echo /root/.ansible/tmp/ansible-tmp-1726853661.5854666-30454-132438859919859 `" ) && sleep 0' 29922 1726853661.59217: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853661.59222: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853661.59227: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853661.59280: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853661.59304: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853661.59405: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853661.61369: stdout chunk (state=3): >>>ansible-tmp-1726853661.5854666-30454-132438859919859=/root/.ansible/tmp/ansible-tmp-1726853661.5854666-30454-132438859919859 <<< 29922 1726853661.61599: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853661.61664: stderr chunk 
(state=3): >>><<< 29922 1726853661.61716: stdout chunk (state=3): >>><<< 29922 1726853661.61798: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853661.5854666-30454-132438859919859=/root/.ansible/tmp/ansible-tmp-1726853661.5854666-30454-132438859919859 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853661.62002: variable 'ansible_module_compression' from source: unknown 29922 1726853661.62006: ANSIBALLZ: Using lock for lineinfile 29922 1726853661.62008: ANSIBALLZ: Acquiring lock 29922 1726853661.62010: ANSIBALLZ: Lock acquired: 140376039144192 29922 1726853661.62012: ANSIBALLZ: Creating module 29922 1726853661.77801: ANSIBALLZ: Writing module into payload 29922 1726853661.77913: ANSIBALLZ: Writing module 29922 1726853661.77947: ANSIBALLZ: Renaming module 29922 1726853661.77950: ANSIBALLZ: Done creating module 29922 1726853661.78064: variable 'ansible_facts' from source: unknown 29922 1726853661.78068: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853661.5854666-30454-132438859919859/AnsiballZ_lineinfile.py 29922 1726853661.78310: Sending initial data 29922 1726853661.78313: Sent initial data (159 bytes) 29922 1726853661.78996: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853661.79050: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853661.79064: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853661.79086: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 29922 1726853661.79176: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853661.80896: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 29922 1726853661.80947: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 29922 1726853661.81026: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-29922avbm1zsd/tmp53xx2add /root/.ansible/tmp/ansible-tmp-1726853661.5854666-30454-132438859919859/AnsiballZ_lineinfile.py <<< 29922 1726853661.81029: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853661.5854666-30454-132438859919859/AnsiballZ_lineinfile.py" <<< 29922 1726853661.81096: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-29922avbm1zsd/tmp53xx2add" to remote "/root/.ansible/tmp/ansible-tmp-1726853661.5854666-30454-132438859919859/AnsiballZ_lineinfile.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853661.5854666-30454-132438859919859/AnsiballZ_lineinfile.py" <<< 29922 1726853661.82106: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853661.82141: stderr chunk (state=3): >>><<< 29922 1726853661.82295: stdout chunk (state=3): >>><<< 29922 1726853661.82299: done transferring module to remote 29922 1726853661.82301: _low_level_execute_command(): starting 29922 1726853661.82303: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853661.5854666-30454-132438859919859/ /root/.ansible/tmp/ansible-tmp-1726853661.5854666-30454-132438859919859/AnsiballZ_lineinfile.py && sleep 0' 29922 1726853661.83090: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853661.83108: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853661.83285: 
stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853661.83327: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853661.83399: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853661.83489: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853661.85491: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853661.85528: stderr chunk (state=3): >>><<< 29922 1726853661.85531: stdout chunk (state=3): >>><<< 29922 1726853661.85581: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853661.85585: _low_level_execute_command(): starting 29922 1726853661.85587: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853661.5854666-30454-132438859919859/AnsiballZ_lineinfile.py && sleep 0' 29922 1726853661.86285: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853661.86389: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853661.86406: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853661.86419: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853661.86513: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 
1726853662.03129: stdout chunk (state=3): >>> {"changed": true, "msg": "line added", "backup": "", "diff": [{"before": "", "after": "", "before_header": "/etc/iproute2/rt_tables.d/table.conf (content)", "after_header": "/etc/iproute2/rt_tables.d/table.conf (content)"}, {"before_header": "/etc/iproute2/rt_tables.d/table.conf (file attributes)", "after_header": "/etc/iproute2/rt_tables.d/table.conf (file attributes)"}], "invocation": {"module_args": {"path": "/etc/iproute2/rt_tables.d/table.conf", "line": "200 custom", "mode": "0644", "create": true, "state": "present", "backrefs": false, "backup": false, "firstmatch": false, "unsafe_writes": false, "regexp": null, "search_string": null, "insertafter": null, "insertbefore": null, "validate": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 29922 1726853662.04527: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853662.04688: stderr chunk (state=3): >>>Shared connection to 10.31.11.217 closed. <<< 29922 1726853662.04692: stdout chunk (state=3): >>><<< 29922 1726853662.04699: stderr chunk (state=3): >>><<< 29922 1726853662.04726: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "msg": "line added", "backup": "", "diff": [{"before": "", "after": "", "before_header": "/etc/iproute2/rt_tables.d/table.conf (content)", "after_header": "/etc/iproute2/rt_tables.d/table.conf (content)"}, {"before_header": "/etc/iproute2/rt_tables.d/table.conf (file attributes)", "after_header": "/etc/iproute2/rt_tables.d/table.conf (file attributes)"}], "invocation": {"module_args": {"path": "/etc/iproute2/rt_tables.d/table.conf", "line": "200 custom", "mode": "0644", "create": true, "state": "present", "backrefs": false, "backup": false, "firstmatch": false, "unsafe_writes": false, "regexp": null, "search_string": null, "insertafter": null, "insertbefore": null, "validate": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
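The JSON payload just returned is the lineinfile module reporting "line added" for /etc/iproute2/rt_tables.d/table.conf; immediately below, the controller finishes the task and deletes the remote temp directory. A sketch of the originating task at tests_routing_rules.yml:23, reconstructed from the logged module_args (path, line, mode and create are the only non-default parameters in the invocation; the rest are lineinfile defaults):

    - name: Create a dedicated test file in `/etc/iproute2/rt_tables.d/` and add a new routing table
      lineinfile:
        path: /etc/iproute2/rt_tables.d/table.conf
        line: "200 custom"
        mode: "0644"
        create: true
        state: present

Drop-in files under /etc/iproute2/rt_tables.d/ extend the kernel routing-table name map, so once this line exists, iproute2 commands such as 'ip route show table custom' resolve the name custom to table ID 200.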
29922 1726853662.04794: done with _execute_module (lineinfile, {'path': '/etc/iproute2/rt_tables.d/table.conf', 'line': '200 custom', 'mode': '0644', 'create': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'lineinfile', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853661.5854666-30454-132438859919859/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 29922 1726853662.04798: _low_level_execute_command(): starting 29922 1726853662.04800: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853661.5854666-30454-132438859919859/ > /dev/null 2>&1 && sleep 0' 29922 1726853662.06050: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853662.06054: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853662.06224: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 29922 1726853662.06232: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853662.06237: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853662.06244: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853662.06247: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853662.06279: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853662.06491: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853662.08389: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853662.08741: stderr chunk (state=3): >>><<< 29922 1726853662.08745: stdout chunk (state=3): >>><<< 29922 1726853662.08748: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853662.08750: handler run complete 29922 1726853662.08752: attempt loop complete, returning result 29922 1726853662.08754: _execute() done 29922 1726853662.08756: dumping result to json 29922 1726853662.08758: done dumping result, returning 29922 1726853662.08760: done running TaskExecutor() for managed_node3/TASK: Create a dedicated test file in `/etc/iproute2/rt_tables.d/` and add a new routing table [02083763-bbaf-51d4-513b-00000000000f] 29922 1726853662.08762: sending task result for task 02083763-bbaf-51d4-513b-00000000000f changed: [managed_node3] => { "backup": "", "changed": true } MSG: line added 29922 1726853662.08943: no more pending results, returning what we have 29922 1726853662.08947: results queue empty 29922 1726853662.08948: checking for any_errors_fatal 29922 1726853662.08957: done checking for any_errors_fatal 29922 1726853662.08958: checking for max_fail_percentage 29922 1726853662.08960: done checking for max_fail_percentage 29922 1726853662.08961: checking to see if all hosts have failed and the running result is not ok 29922 1726853662.08961: done checking to see if all hosts have failed 29922 1726853662.08962: getting the remaining hosts for this loop 29922 1726853662.08964: done getting the remaining hosts for this loop 29922 1726853662.08968: getting the next task for host managed_node3 29922 1726853662.08977: done getting next task for host managed_node3 29922 1726853662.08983: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 29922 1726853662.08985: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853662.09003: getting variables 29922 1726853662.09004: in VariableManager get_vars() 29922 1726853662.09047: Calling all_inventory to load vars for managed_node3 29922 1726853662.09050: Calling groups_inventory to load vars for managed_node3 29922 1726853662.09052: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853662.09410: Calling all_plugins_play to load vars for managed_node3 29922 1726853662.09414: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853662.09419: Calling groups_plugins_play to load vars for managed_node3 29922 1726853662.10091: done sending task result for task 02083763-bbaf-51d4-513b-00000000000f 29922 1726853662.10094: WORKER PROCESS EXITING 29922 1726853662.10123: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853662.10462: done with get_vars() 29922 1726853662.10576: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 13:34:22 -0400 (0:00:00.561) 0:00:11.037 ****** 29922 1726853662.10789: entering _queue_task() for managed_node3/include_tasks 29922 1726853662.11350: worker is 1 (out of 1 available) 29922 1726853662.11363: exiting _queue_task() for managed_node3/include_tasks 29922 1726853662.11480: done queuing things up, now waiting for results queue to drain 29922 1726853662.11482: waiting for pending results... 29922 1726853662.11838: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 29922 1726853662.11934: in run() - task 02083763-bbaf-51d4-513b-000000000017 29922 1726853662.11948: variable 'ansible_search_path' from source: unknown 29922 1726853662.11956: variable 'ansible_search_path' from source: unknown 29922 1726853662.12174: calling self._execute() 29922 1726853662.12376: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853662.12381: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853662.12384: variable 'omit' from source: magic vars 29922 1726853662.12682: variable 'ansible_distribution_major_version' from source: facts 29922 1726853662.12700: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853662.12715: _execute() done 29922 1726853662.12723: dumping result to json 29922 1726853662.12731: done dumping result, returning 29922 1726853662.12764: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [02083763-bbaf-51d4-513b-000000000017] 29922 1726853662.12776: sending task result for task 02083763-bbaf-51d4-513b-000000000017 29922 1726853662.12915: no more pending results, returning what we have 29922 1726853662.12920: in VariableManager get_vars() 29922 1726853662.12961: Calling all_inventory to load vars for managed_node3 29922 1726853662.12963: Calling groups_inventory to load vars for managed_node3 29922 1726853662.12965: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853662.12979: Calling all_plugins_play to load vars for managed_node3 29922 1726853662.12981: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853662.12984: Calling groups_plugins_play to load vars for managed_node3 29922 1726853662.13423: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853662.13713: done with get_vars() 29922 1726853662.13836: variable 'ansible_search_path' from source: unknown 29922 1726853662.13838: variable 'ansible_search_path' from source: unknown 29922 1726853662.13976: we have included files to process 29922 1726853662.13977: generating all_blocks data 29922 1726853662.13979: done generating all_blocks data 29922 1726853662.13984: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 29922 1726853662.13985: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 29922 1726853662.13988: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 29922 1726853662.14598: done sending task result for task 02083763-bbaf-51d4-513b-000000000017 29922 1726853662.14601: WORKER PROCESS EXITING 29922 1726853662.15326: done processing included file 29922 1726853662.15328: iterating over new_blocks loaded from include file 29922 1726853662.15330: in VariableManager get_vars() 29922 1726853662.15359: done with get_vars() 29922 1726853662.15361: filtering new block on tags 29922 1726853662.15380: done filtering new block on tags 29922 1726853662.15383: in VariableManager get_vars() 29922 1726853662.15404: done with get_vars() 29922 1726853662.15406: filtering new block on tags 29922 1726853662.15447: done filtering new block on tags 29922 1726853662.15450: in VariableManager get_vars() 29922 1726853662.15478: done with get_vars() 29922 1726853662.15479: filtering new block on tags 29922 1726853662.15497: done filtering new block on tags 29922 1726853662.15499: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3 29922 1726853662.15504: extending task lists for all hosts with included blocks 29922 1726853662.16304: done extending task lists 29922 1726853662.16306: done processing included files 29922 1726853662.16307: results queue empty 29922 1726853662.16307: checking for any_errors_fatal 29922 1726853662.16312: done checking for any_errors_fatal 29922 1726853662.16313: checking for max_fail_percentage 29922 1726853662.16314: done checking for max_fail_percentage 29922 1726853662.16315: checking to see if all hosts have failed and the running result is not ok 29922 1726853662.16316: done checking to see if all hosts have failed 29922 1726853662.16316: getting the remaining hosts for this loop 29922 1726853662.16318: done getting the remaining hosts for this loop 29922 1726853662.16321: getting the next task for host managed_node3 29922 1726853662.16324: done getting next task for host managed_node3 29922 1726853662.16327: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 29922 1726853662.16444: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29922 1726853662.16454: getting variables 29922 1726853662.16455: in VariableManager get_vars() 29922 1726853662.16469: Calling all_inventory to load vars for managed_node3 29922 1726853662.16474: Calling groups_inventory to load vars for managed_node3 29922 1726853662.16476: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853662.16482: Calling all_plugins_play to load vars for managed_node3 29922 1726853662.16484: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853662.16487: Calling groups_plugins_play to load vars for managed_node3 29922 1726853662.16751: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853662.17420: done with get_vars() 29922 1726853662.17429: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 13:34:22 -0400 (0:00:00.068) 0:00:11.105 ****** 29922 1726853662.17633: entering _queue_task() for managed_node3/setup 29922 1726853662.18176: worker is 1 (out of 1 available) 29922 1726853662.18283: exiting _queue_task() for managed_node3/setup 29922 1726853662.18295: done queuing things up, now waiting for results queue to drain 29922 1726853662.18366: waiting for pending results... 
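(Editor's note: the "line added" result earlier in this log came from a lineinfile task whose arguments are visible in the _execute_module dump above (path, line, mode, create). A minimal reconstruction of that task is sketched below; only the module arguments and the task name are taken from the log, the FQCN form and surrounding playbook structure are illustrative assumptions.)

    - name: Create a dedicated test file in `/etc/iproute2/rt_tables.d/` and add a new routing table
      ansible.builtin.lineinfile:
        path: /etc/iproute2/rt_tables.d/table.conf
        line: "200 custom"
        mode: "0644"
        create: true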
29922 1726853662.18916: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 29922 1726853662.19098: in run() - task 02083763-bbaf-51d4-513b-0000000003d0 29922 1726853662.19120: variable 'ansible_search_path' from source: unknown 29922 1726853662.19127: variable 'ansible_search_path' from source: unknown 29922 1726853662.19162: calling self._execute() 29922 1726853662.19253: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853662.19266: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853662.19284: variable 'omit' from source: magic vars 29922 1726853662.19641: variable 'ansible_distribution_major_version' from source: facts 29922 1726853662.19658: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853662.19880: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 29922 1726853662.24369: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 29922 1726853662.24437: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 29922 1726853662.24482: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 29922 1726853662.24526: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 29922 1726853662.24556: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 29922 1726853662.24639: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853662.24669: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29922 1726853662.24706: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853662.24751: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853662.24769: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853662.24831: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853662.24855: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29922 1726853662.24883: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853662.25149: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853662.25152: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853662.25307: variable '__network_required_facts' from source: role '' defaults 29922 1726853662.25382: variable 'ansible_facts' from source: unknown 29922 1726853662.25603: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 29922 1726853662.25611: when evaluation is False, skipping this task 29922 1726853662.25618: _execute() done 29922 1726853662.25624: dumping result to json 29922 1726853662.25692: done dumping result, returning 29922 1726853662.25696: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [02083763-bbaf-51d4-513b-0000000003d0] 29922 1726853662.25699: sending task result for task 02083763-bbaf-51d4-513b-0000000003d0 skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 29922 1726853662.25934: no more pending results, returning what we have 29922 1726853662.25938: results queue empty 29922 1726853662.25939: checking for any_errors_fatal 29922 1726853662.25941: done checking for any_errors_fatal 29922 1726853662.25941: checking for max_fail_percentage 29922 1726853662.25943: done checking for max_fail_percentage 29922 1726853662.25943: checking to see if all hosts have failed and the running result is not ok 29922 1726853662.25944: done checking to see if all hosts have failed 29922 1726853662.25945: getting the remaining hosts for this loop 29922 1726853662.25946: done getting the remaining hosts for this loop 29922 1726853662.25949: getting the next task for host managed_node3 29922 1726853662.25960: done getting next task for host managed_node3 29922 1726853662.25963: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 29922 1726853662.25967: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853662.25983: getting variables 29922 1726853662.25985: in VariableManager get_vars() 29922 1726853662.26025: Calling all_inventory to load vars for managed_node3 29922 1726853662.26028: Calling groups_inventory to load vars for managed_node3 29922 1726853662.26031: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853662.26041: Calling all_plugins_play to load vars for managed_node3 29922 1726853662.26044: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853662.26048: Calling groups_plugins_play to load vars for managed_node3 29922 1726853662.26520: done sending task result for task 02083763-bbaf-51d4-513b-0000000003d0 29922 1726853662.26524: WORKER PROCESS EXITING 29922 1726853662.26548: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853662.27766: done with get_vars() 29922 1726853662.27777: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 13:34:22 -0400 (0:00:00.102) 0:00:11.208 ****** 29922 1726853662.27877: entering _queue_task() for managed_node3/stat 29922 1726853662.28422: worker is 1 (out of 1 available) 29922 1726853662.28435: exiting _queue_task() for managed_node3/stat 29922 1726853662.28447: done queuing things up, now waiting for results queue to drain 29922 1726853662.28448: waiting for pending results... 29922 1726853662.28800: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 29922 1726853662.28978: in run() - task 02083763-bbaf-51d4-513b-0000000003d2 29922 1726853662.28997: variable 'ansible_search_path' from source: unknown 29922 1726853662.29005: variable 'ansible_search_path' from source: unknown 29922 1726853662.29049: calling self._execute() 29922 1726853662.29144: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853662.29167: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853662.29266: variable 'omit' from source: magic vars 29922 1726853662.29598: variable 'ansible_distribution_major_version' from source: facts 29922 1726853662.29616: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853662.29798: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 29922 1726853662.30134: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 29922 1726853662.30161: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 29922 1726853662.30196: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 29922 1726853662.30228: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 29922 1726853662.30322: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 29922 1726853662.30377: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 29922 1726853662.30393: Loading TestModule 
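(Editor's note: the skip above is the role's facts check: a setup task guarded by the conditional quoted in the log, so facts are only gathered when something listed in __network_required_facts is missing from ansible_facts. A rough sketch of that pattern follows; the gather_subset value is a placeholder assumption, not the role's actual source.)

    - name: Ensure ansible_facts used by role are present
      ansible.builtin.setup:
        gather_subset: min   # hypothetical subset; the real role may request different subsets
      when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0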
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853662.30419: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 29922 1726853662.30569: variable '__network_is_ostree' from source: set_fact 29922 1726853662.30573: Evaluated conditional (not __network_is_ostree is defined): False 29922 1726853662.30576: when evaluation is False, skipping this task 29922 1726853662.30578: _execute() done 29922 1726853662.30580: dumping result to json 29922 1726853662.30582: done dumping result, returning 29922 1726853662.30584: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [02083763-bbaf-51d4-513b-0000000003d2] 29922 1726853662.30586: sending task result for task 02083763-bbaf-51d4-513b-0000000003d2 skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 29922 1726853662.30724: no more pending results, returning what we have 29922 1726853662.30729: results queue empty 29922 1726853662.30730: checking for any_errors_fatal 29922 1726853662.30737: done checking for any_errors_fatal 29922 1726853662.30738: checking for max_fail_percentage 29922 1726853662.30740: done checking for max_fail_percentage 29922 1726853662.30741: checking to see if all hosts have failed and the running result is not ok 29922 1726853662.30742: done checking to see if all hosts have failed 29922 1726853662.30743: getting the remaining hosts for this loop 29922 1726853662.30745: done getting the remaining hosts for this loop 29922 1726853662.30749: getting the next task for host managed_node3 29922 1726853662.30757: done getting next task for host managed_node3 29922 1726853662.30765: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 29922 1726853662.30770: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853662.30787: getting variables 29922 1726853662.30789: in VariableManager get_vars() 29922 1726853662.30826: Calling all_inventory to load vars for managed_node3 29922 1726853662.30829: Calling groups_inventory to load vars for managed_node3 29922 1726853662.30831: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853662.30840: Calling all_plugins_play to load vars for managed_node3 29922 1726853662.30842: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853662.30845: Calling groups_plugins_play to load vars for managed_node3 29922 1726853662.31620: done sending task result for task 02083763-bbaf-51d4-513b-0000000003d2 29922 1726853662.31624: WORKER PROCESS EXITING 29922 1726853662.31895: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853662.32329: done with get_vars() 29922 1726853662.32341: done getting variables 29922 1726853662.32405: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 13:34:22 -0400 (0:00:00.045) 0:00:11.253 ****** 29922 1726853662.32441: entering _queue_task() for managed_node3/set_fact 29922 1726853662.32932: worker is 1 (out of 1 available) 29922 1726853662.32943: exiting _queue_task() for managed_node3/set_fact 29922 1726853662.32955: done queuing things up, now waiting for results queue to drain 29922 1726853662.32956: waiting for pending results... 
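(Editor's note: the two skipped tasks here, "Check if system is ostree" and "Set flag to indicate system is ostree", follow a common ostree-detection pattern: a stat on an ostree marker file, then a set_fact recording the result, both guarded so they run only once per host. Sketch below; the marker path and register name are assumptions for illustration, only the `not __network_is_ostree is defined` conditional and the fact name appear in this log.)

    - name: Check if system is ostree
      ansible.builtin.stat:
        path: /run/ostree-booted   # assumed marker path; not shown in this log
      register: __ostree_booted_stat   # hypothetical register name
      when: not __network_is_ostree is defined

    - name: Set flag to indicate system is ostree
      ansible.builtin.set_fact:
        __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
      when: not __network_is_ostree is defined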
29922 1726853662.33263: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 29922 1726853662.33437: in run() - task 02083763-bbaf-51d4-513b-0000000003d3 29922 1726853662.33461: variable 'ansible_search_path' from source: unknown 29922 1726853662.33470: variable 'ansible_search_path' from source: unknown 29922 1726853662.33519: calling self._execute() 29922 1726853662.33609: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853662.33628: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853662.33644: variable 'omit' from source: magic vars 29922 1726853662.34054: variable 'ansible_distribution_major_version' from source: facts 29922 1726853662.34079: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853662.34252: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 29922 1726853662.34645: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 29922 1726853662.34699: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 29922 1726853662.34742: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 29922 1726853662.34785: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 29922 1726853662.34928: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 29922 1726853662.34931: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 29922 1726853662.34950: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853662.34986: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 29922 1726853662.35084: variable '__network_is_ostree' from source: set_fact 29922 1726853662.35096: Evaluated conditional (not __network_is_ostree is defined): False 29922 1726853662.35106: when evaluation is False, skipping this task 29922 1726853662.35143: _execute() done 29922 1726853662.35146: dumping result to json 29922 1726853662.35148: done dumping result, returning 29922 1726853662.35151: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [02083763-bbaf-51d4-513b-0000000003d3] 29922 1726853662.35154: sending task result for task 02083763-bbaf-51d4-513b-0000000003d3 skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 29922 1726853662.35423: no more pending results, returning what we have 29922 1726853662.35428: results queue empty 29922 1726853662.35429: checking for any_errors_fatal 29922 1726853662.35436: done checking for any_errors_fatal 29922 1726853662.35436: checking for max_fail_percentage 29922 1726853662.35438: done checking for max_fail_percentage 29922 1726853662.35439: checking to see 
if all hosts have failed and the running result is not ok 29922 1726853662.35440: done checking to see if all hosts have failed 29922 1726853662.35441: getting the remaining hosts for this loop 29922 1726853662.35442: done getting the remaining hosts for this loop 29922 1726853662.35446: getting the next task for host managed_node3 29922 1726853662.35456: done getting next task for host managed_node3 29922 1726853662.35462: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 29922 1726853662.35467: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29922 1726853662.35483: getting variables 29922 1726853662.35485: in VariableManager get_vars() 29922 1726853662.35524: Calling all_inventory to load vars for managed_node3 29922 1726853662.35527: Calling groups_inventory to load vars for managed_node3 29922 1726853662.35530: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853662.35541: Calling all_plugins_play to load vars for managed_node3 29922 1726853662.35544: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853662.35548: Calling groups_plugins_play to load vars for managed_node3 29922 1726853662.36085: done sending task result for task 02083763-bbaf-51d4-513b-0000000003d3 29922 1726853662.36088: WORKER PROCESS EXITING 29922 1726853662.36419: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853662.36850: done with get_vars() 29922 1726853662.36862: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 13:34:22 -0400 (0:00:00.045) 0:00:11.299 ****** 29922 1726853662.36946: entering _queue_task() for managed_node3/service_facts 29922 1726853662.36948: Creating lock for service_facts 29922 1726853662.37544: worker is 1 (out of 1 available) 29922 1726853662.37561: exiting _queue_task() for managed_node3/service_facts 29922 1726853662.37676: done queuing things up, now waiting for results queue to drain 29922 1726853662.37678: waiting for pending results... 
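(Editor's note: the trace that follows shows the service_facts module being packaged by ANSIBALLZ, copied to the target over the multiplexed SSH connection via SFTP, and executed; its output, visible further down, lands in ansible_facts.services as a dict keyed by systemd unit name. A hedged sketch of how such a task is typically written and consumed; the debug task is illustrative only and is not part of this run.)

    - name: Check which services are running
      ansible.builtin.service_facts:

    - name: Example only - report whether NetworkManager is running   # hypothetical follow-up task
      ansible.builtin.debug:
        msg: "NetworkManager running: {{ ansible_facts.services['NetworkManager.service'].state == 'running' }}"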
29922 1726853662.38068: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running 29922 1726853662.38533: in run() - task 02083763-bbaf-51d4-513b-0000000003d5 29922 1726853662.38554: variable 'ansible_search_path' from source: unknown 29922 1726853662.38583: variable 'ansible_search_path' from source: unknown 29922 1726853662.38978: calling self._execute() 29922 1726853662.38982: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853662.38985: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853662.38988: variable 'omit' from source: magic vars 29922 1726853662.39646: variable 'ansible_distribution_major_version' from source: facts 29922 1726853662.39754: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853662.39770: variable 'omit' from source: magic vars 29922 1726853662.39846: variable 'omit' from source: magic vars 29922 1726853662.40073: variable 'omit' from source: magic vars 29922 1726853662.40077: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29922 1726853662.40286: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29922 1726853662.40289: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29922 1726853662.40292: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853662.40295: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853662.40297: variable 'inventory_hostname' from source: host vars for 'managed_node3' 29922 1726853662.40300: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853662.40302: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853662.40467: Set connection var ansible_connection to ssh 29922 1726853662.40588: Set connection var ansible_module_compression to ZIP_DEFLATED 29922 1726853662.40602: Set connection var ansible_shell_executable to /bin/sh 29922 1726853662.40619: Set connection var ansible_pipelining to False 29922 1726853662.40630: Set connection var ansible_timeout to 10 29922 1726853662.40638: Set connection var ansible_shell_type to sh 29922 1726853662.40670: variable 'ansible_shell_executable' from source: unknown 29922 1726853662.40726: variable 'ansible_connection' from source: unknown 29922 1726853662.40735: variable 'ansible_module_compression' from source: unknown 29922 1726853662.40742: variable 'ansible_shell_type' from source: unknown 29922 1726853662.40750: variable 'ansible_shell_executable' from source: unknown 29922 1726853662.40757: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853662.40768: variable 'ansible_pipelining' from source: unknown 29922 1726853662.40877: variable 'ansible_timeout' from source: unknown 29922 1726853662.40880: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853662.41151: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 29922 1726853662.41578: variable 'omit' from source: magic vars 29922 
1726853662.41581: starting attempt loop 29922 1726853662.41584: running the handler 29922 1726853662.41586: _low_level_execute_command(): starting 29922 1726853662.41588: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 29922 1726853662.42790: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853662.42809: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853662.42886: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853662.43086: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853662.43128: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853662.43191: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853662.45089: stdout chunk (state=3): >>>/root <<< 29922 1726853662.45106: stdout chunk (state=3): >>><<< 29922 1726853662.45114: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853662.45124: stderr chunk (state=3): >>><<< 29922 1726853662.45149: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853662.45170: _low_level_execute_command(): starting 29922 1726853662.45204: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853662.4515626-30489-274757813969703 `" && echo 
ansible-tmp-1726853662.4515626-30489-274757813969703="` echo /root/.ansible/tmp/ansible-tmp-1726853662.4515626-30489-274757813969703 `" ) && sleep 0' 29922 1726853662.46622: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 29922 1726853662.46689: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853662.46836: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853662.46893: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853662.46959: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853662.49044: stdout chunk (state=3): >>>ansible-tmp-1726853662.4515626-30489-274757813969703=/root/.ansible/tmp/ansible-tmp-1726853662.4515626-30489-274757813969703 <<< 29922 1726853662.49092: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853662.49125: stderr chunk (state=3): >>><<< 29922 1726853662.49143: stdout chunk (state=3): >>><<< 29922 1726853662.49168: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853662.4515626-30489-274757813969703=/root/.ansible/tmp/ansible-tmp-1726853662.4515626-30489-274757813969703 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853662.49297: variable 'ansible_module_compression' from source: unknown 29922 1726853662.49574: ANSIBALLZ: Using lock for service_facts 29922 1726853662.49578: ANSIBALLZ: Acquiring lock 29922 1726853662.49580: ANSIBALLZ: Lock 
acquired: 140376041245008 29922 1726853662.49583: ANSIBALLZ: Creating module 29922 1726853662.74816: ANSIBALLZ: Writing module into payload 29922 1726853662.75090: ANSIBALLZ: Writing module 29922 1726853662.75119: ANSIBALLZ: Renaming module 29922 1726853662.75130: ANSIBALLZ: Done creating module 29922 1726853662.75187: variable 'ansible_facts' from source: unknown 29922 1726853662.75388: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853662.4515626-30489-274757813969703/AnsiballZ_service_facts.py 29922 1726853662.75717: Sending initial data 29922 1726853662.75733: Sent initial data (162 bytes) 29922 1726853662.76444: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853662.76467: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853662.76493: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853662.76581: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853662.76609: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853662.76627: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853662.76688: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853662.76886: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853662.78699: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 29922 1726853662.78716: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 29922 1726853662.78951: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 29922 1726853662.79009: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-29922avbm1zsd/tmpkdv815xx /root/.ansible/tmp/ansible-tmp-1726853662.4515626-30489-274757813969703/AnsiballZ_service_facts.py <<< 29922 1726853662.79027: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853662.4515626-30489-274757813969703/AnsiballZ_service_facts.py" <<< 29922 1726853662.79094: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-29922avbm1zsd/tmpkdv815xx" to remote "/root/.ansible/tmp/ansible-tmp-1726853662.4515626-30489-274757813969703/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853662.4515626-30489-274757813969703/AnsiballZ_service_facts.py" <<< 29922 1726853662.81378: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853662.81384: stdout chunk (state=3): >>><<< 29922 1726853662.81387: stderr chunk (state=3): >>><<< 29922 1726853662.81403: done transferring module to remote 29922 1726853662.81419: _low_level_execute_command(): starting 29922 1726853662.81580: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853662.4515626-30489-274757813969703/ /root/.ansible/tmp/ansible-tmp-1726853662.4515626-30489-274757813969703/AnsiballZ_service_facts.py && sleep 0' 29922 1726853662.82793: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853662.82965: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853662.83274: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853662.83331: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853662.85293: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853662.85297: stderr chunk (state=3): >>><<< 29922 1726853662.85300: stdout chunk (state=3): >>><<< 29922 1726853662.85517: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853662.85520: _low_level_execute_command(): starting 29922 1726853662.85523: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853662.4515626-30489-274757813969703/AnsiballZ_service_facts.py && sleep 0' 29922 1726853662.86808: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853662.86899: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853662.86941: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853662.86955: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853662.87201: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853664.48037: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": 
"crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "st<<< 29922 1726853664.48065: stdout chunk (state=3): >>>opped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": 
"systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": 
{"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.s<<< 29922 1726853664.48092: stdout chunk (state=3): >>>ervice", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "stat<<< 29922 1726853664.48189: stdout chunk (state=3): >>>us": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, 
"dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": 
"quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 29922 1726853664.49668: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853664.49685: stderr chunk (state=3): >>>Shared connection to 10.31.11.217 closed. <<< 29922 1726853664.49744: stderr chunk (state=3): >>><<< 29922 1726853664.49752: stdout chunk (state=3): >>><<< 29922 1726853664.49782: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": 
"systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, 
"ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", 
"state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
29922 1726853664.50429: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853662.4515626-30489-274757813969703/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 29922 1726853664.50443: _low_level_execute_command(): starting 29922 1726853664.50452: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853662.4515626-30489-274757813969703/ > /dev/null 2>&1 && sleep 0' 29922 1726853664.51039: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853664.51053: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853664.51067: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853664.51087: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853664.51106: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 29922 1726853664.51118: stderr chunk (state=3): >>>debug2: match not found <<< 29922 1726853664.51131: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853664.51148: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 29922 1726853664.51160: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<< 29922 1726853664.51180: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 29922 1726853664.51194: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853664.51277: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853664.51304: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853664.51392: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853664.53346: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853664.53350: stdout chunk (state=3): >>><<< 29922 1726853664.53357: stderr chunk (state=3): >>><<< 29922 1726853664.53379: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853664.53385: handler run complete 29922 1726853664.53590: variable 'ansible_facts' from source: unknown 29922 1726853664.53750: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853664.55209: variable 'ansible_facts' from source: unknown 29922 1726853664.55609: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853664.56282: attempt loop complete, returning result 29922 1726853664.56286: _execute() done 29922 1726853664.56288: dumping result to json 29922 1726853664.56570: done dumping result, returning 29922 1726853664.56582: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [02083763-bbaf-51d4-513b-0000000003d5] 29922 1726853664.56587: sending task result for task 02083763-bbaf-51d4-513b-0000000003d5 ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 29922 1726853664.58722: no more pending results, returning what we have 29922 1726853664.58724: results queue empty 29922 1726853664.58725: checking for any_errors_fatal 29922 1726853664.58730: done checking for any_errors_fatal 29922 1726853664.58731: checking for max_fail_percentage 29922 1726853664.58732: done checking for max_fail_percentage 29922 1726853664.58733: checking to see if all hosts have failed and the running result is not ok 29922 1726853664.58734: done checking to see if all hosts have failed 29922 1726853664.58734: getting the remaining hosts for this loop 29922 1726853664.58736: done getting the remaining hosts for this loop 29922 1726853664.58739: getting the next task for host managed_node3 29922 1726853664.58744: done getting next task for host managed_node3 29922 1726853664.58747: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 29922 1726853664.58751: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29922 1726853664.58759: getting variables 29922 1726853664.58761: in VariableManager get_vars() 29922 1726853664.58793: Calling all_inventory to load vars for managed_node3 29922 1726853664.58795: Calling groups_inventory to load vars for managed_node3 29922 1726853664.58797: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853664.58807: Calling all_plugins_play to load vars for managed_node3 29922 1726853664.58809: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853664.58812: Calling groups_plugins_play to load vars for managed_node3 29922 1726853664.59593: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853664.60183: done sending task result for task 02083763-bbaf-51d4-513b-0000000003d5 29922 1726853664.60187: WORKER PROCESS EXITING 29922 1726853664.60230: done with get_vars() 29922 1726853664.60253: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 13:34:24 -0400 (0:00:02.233) 0:00:13.533 ****** 29922 1726853664.60346: entering _queue_task() for managed_node3/package_facts 29922 1726853664.60348: Creating lock for package_facts 29922 1726853664.60649: worker is 1 (out of 1 available) 29922 1726853664.60661: exiting _queue_task() for managed_node3/package_facts 29922 1726853664.60675: done queuing things up, now waiting for results queue to drain 29922 1726853664.60676: waiting for pending results... 29922 1726853664.61389: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 29922 1726853664.61529: in run() - task 02083763-bbaf-51d4-513b-0000000003d6 29922 1726853664.61594: variable 'ansible_search_path' from source: unknown 29922 1726853664.61603: variable 'ansible_search_path' from source: unknown 29922 1726853664.61780: calling self._execute() 29922 1726853664.61890: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853664.61904: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853664.61920: variable 'omit' from source: magic vars 29922 1726853664.62434: variable 'ansible_distribution_major_version' from source: facts 29922 1726853664.62452: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853664.62465: variable 'omit' from source: magic vars 29922 1726853664.62541: variable 'omit' from source: magic vars 29922 1726853664.62582: variable 'omit' from source: magic vars 29922 1726853664.62625: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29922 1726853664.62669: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29922 1726853664.62695: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29922 1726853664.62717: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853664.62733: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853664.62773: variable 
'inventory_hostname' from source: host vars for 'managed_node3' 29922 1726853664.62782: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853664.62790: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853664.62895: Set connection var ansible_connection to ssh 29922 1726853664.62908: Set connection var ansible_module_compression to ZIP_DEFLATED 29922 1726853664.62920: Set connection var ansible_shell_executable to /bin/sh 29922 1726853664.62932: Set connection var ansible_pipelining to False 29922 1726853664.62942: Set connection var ansible_timeout to 10 29922 1726853664.62948: Set connection var ansible_shell_type to sh 29922 1726853664.62981: variable 'ansible_shell_executable' from source: unknown 29922 1726853664.62990: variable 'ansible_connection' from source: unknown 29922 1726853664.62997: variable 'ansible_module_compression' from source: unknown 29922 1726853664.63004: variable 'ansible_shell_type' from source: unknown 29922 1726853664.63010: variable 'ansible_shell_executable' from source: unknown 29922 1726853664.63017: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853664.63024: variable 'ansible_pipelining' from source: unknown 29922 1726853664.63030: variable 'ansible_timeout' from source: unknown 29922 1726853664.63039: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853664.63237: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 29922 1726853664.63286: variable 'omit' from source: magic vars 29922 1726853664.63289: starting attempt loop 29922 1726853664.63292: running the handler 29922 1726853664.63294: _low_level_execute_command(): starting 29922 1726853664.63298: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 29922 1726853664.64164: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 29922 1726853664.64279: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853664.64301: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853664.64339: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853664.64402: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853664.66124: stdout chunk (state=3): >>>/root <<< 29922 1726853664.66502: stderr chunk (state=3): >>>debug2: 
Received exit status from master 0 <<< 29922 1726853664.66505: stdout chunk (state=3): >>><<< 29922 1726853664.66508: stderr chunk (state=3): >>><<< 29922 1726853664.66512: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853664.66514: _low_level_execute_command(): starting 29922 1726853664.66517: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853664.6648014-30596-183193387767727 `" && echo ansible-tmp-1726853664.6648014-30596-183193387767727="` echo /root/.ansible/tmp/ansible-tmp-1726853664.6648014-30596-183193387767727 `" ) && sleep 0' 29922 1726853664.67608: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853664.67877: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853664.67916: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853664.67973: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853664.70083: stdout chunk (state=3): >>>ansible-tmp-1726853664.6648014-30596-183193387767727=/root/.ansible/tmp/ansible-tmp-1726853664.6648014-30596-183193387767727 <<< 29922 1726853664.70088: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853664.70206: stderr chunk (state=3): >>><<< 29922 1726853664.70210: 
stdout chunk (state=3): >>><<< 29922 1726853664.70213: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853664.6648014-30596-183193387767727=/root/.ansible/tmp/ansible-tmp-1726853664.6648014-30596-183193387767727 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853664.70216: variable 'ansible_module_compression' from source: unknown 29922 1726853664.70277: ANSIBALLZ: Using lock for package_facts 29922 1726853664.70321: ANSIBALLZ: Acquiring lock 29922 1726853664.70330: ANSIBALLZ: Lock acquired: 140376041479472 29922 1726853664.70423: ANSIBALLZ: Creating module 29922 1726853665.32078: ANSIBALLZ: Writing module into payload 29922 1726853665.32409: ANSIBALLZ: Writing module 29922 1726853665.32656: ANSIBALLZ: Renaming module 29922 1726853665.32662: ANSIBALLZ: Done creating module 29922 1726853665.32664: variable 'ansible_facts' from source: unknown 29922 1726853665.32876: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853664.6648014-30596-183193387767727/AnsiballZ_package_facts.py 29922 1726853665.33520: Sending initial data 29922 1726853665.33529: Sent initial data (162 bytes) 29922 1726853665.34689: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853665.34709: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853665.34725: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853665.34787: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853665.34976: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK 
<<< 29922 1726853665.35142: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853665.35241: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853665.36908: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 29922 1726853665.36965: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 29922 1726853665.37023: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-29922avbm1zsd/tmpsz4ka6d2 /root/.ansible/tmp/ansible-tmp-1726853664.6648014-30596-183193387767727/AnsiballZ_package_facts.py <<< 29922 1726853665.37027: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853664.6648014-30596-183193387767727/AnsiballZ_package_facts.py" <<< 29922 1726853665.37102: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-29922avbm1zsd/tmpsz4ka6d2" to remote "/root/.ansible/tmp/ansible-tmp-1726853664.6648014-30596-183193387767727/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853664.6648014-30596-183193387767727/AnsiballZ_package_facts.py" <<< 29922 1726853665.39401: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853665.39416: stdout chunk (state=3): >>><<< 29922 1726853665.39445: stderr chunk (state=3): >>><<< 29922 1726853665.39715: done transferring module to remote 29922 1726853665.39718: _low_level_execute_command(): starting 29922 1726853665.39720: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853664.6648014-30596-183193387767727/ /root/.ansible/tmp/ansible-tmp-1726853664.6648014-30596-183193387767727/AnsiballZ_package_facts.py && sleep 0' 29922 1726853665.41045: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 
1726853665.41125: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853665.41168: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853665.41190: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853665.41282: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853665.43214: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853665.43267: stderr chunk (state=3): >>><<< 29922 1726853665.43270: stdout chunk (state=3): >>><<< 29922 1726853665.43332: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853665.43335: _low_level_execute_command(): starting 29922 1726853665.43338: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853664.6648014-30596-183193387767727/AnsiballZ_package_facts.py && sleep 0' 29922 1726853665.43991: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853665.44079: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853665.44109: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853665.44128: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853665.44226: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session 
id: 2 <<< 29922 1726853665.88730: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": 
"17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "rele<<< 29922 1726853665.88953: stdout chunk (state=3): >>>ase": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": 
"8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", 
"version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": 
"rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", 
"source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": 
[{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.7<<< 29922 1726853665.89021: stdout chunk (state=3): >>>3.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": 
[{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", 
"version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": 
"perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": 
"perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": 
"restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.<<< 29922 1726853665.89029: stdout chunk (state=3): >>>26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 29922 1726853665.90958: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 29922 1726853665.90961: stdout chunk (state=3): >>><<< 29922 1726853665.90964: stderr chunk (state=3): >>><<< 29922 1726853665.91157: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": 
[{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", 
"release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", 
"version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": 
"1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": 
"1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": 
[{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": 
"510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], 
"perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 29922 1726853665.95996: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853664.6648014-30596-183193387767727/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 29922 1726853665.96033: _low_level_execute_command(): starting 29922 1726853665.96042: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853664.6648014-30596-183193387767727/ > /dev/null 2>&1 && sleep 0' 29922 1726853665.96685: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853665.96699: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853665.96783: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853665.96819: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853665.96834: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853665.96854: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853665.97001: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853665.98916: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853665.98929: stdout chunk (state=3): >>><<< 29922 1726853665.98943: stderr chunk (state=3): >>><<< 29922 1726853665.98969: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853665.98983: handler run complete 29922 1726853665.99927: variable 'ansible_facts' from source: unknown 29922 1726853666.00608: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853666.08799: variable 'ansible_facts' from source: unknown 29922 1726853666.09038: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853666.09703: attempt loop complete, returning result 29922 1726853666.09707: _execute() done 29922 1726853666.09710: dumping result to json 29922 1726853666.09888: done dumping result, returning 29922 1726853666.09903: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [02083763-bbaf-51d4-513b-0000000003d6] 29922 1726853666.09915: sending task result for task 02083763-bbaf-51d4-513b-0000000003d6 29922 1726853666.11850: done sending task result for task 02083763-bbaf-51d4-513b-0000000003d6 29922 1726853666.11852: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 29922 1726853666.11958: no more pending results, returning what we have 29922 1726853666.11960: results queue empty 29922 1726853666.11961: checking for any_errors_fatal 29922 1726853666.11965: done checking for any_errors_fatal 29922 1726853666.11965: checking for max_fail_percentage 29922 1726853666.11966: done checking for max_fail_percentage 29922 1726853666.11967: checking to see if all hosts have failed and the running result is not ok 29922 1726853666.11967: done checking to see if all hosts have failed 29922 1726853666.11968: getting the remaining hosts for this loop 29922 1726853666.11968: done getting the remaining hosts for this loop 29922 1726853666.11972: getting the next task for host managed_node3 29922 1726853666.11977: done getting next task for host managed_node3 29922 1726853666.11979: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 29922 1726853666.11981: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853666.11987: getting variables 29922 1726853666.11988: in VariableManager get_vars() 29922 1726853666.12009: Calling all_inventory to load vars for managed_node3 29922 1726853666.12011: Calling groups_inventory to load vars for managed_node3 29922 1726853666.12012: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853666.12018: Calling all_plugins_play to load vars for managed_node3 29922 1726853666.12020: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853666.12021: Calling groups_plugins_play to load vars for managed_node3 29922 1726853666.13008: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853666.15481: done with get_vars() 29922 1726853666.15511: done getting variables 29922 1726853666.15576: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 13:34:26 -0400 (0:00:01.552) 0:00:15.085 ****** 29922 1726853666.15613: entering _queue_task() for managed_node3/debug 29922 1726853666.16361: worker is 1 (out of 1 available) 29922 1726853666.16376: exiting _queue_task() for managed_node3/debug 29922 1726853666.16388: done queuing things up, now waiting for results queue to drain 29922 1726853666.16389: waiting for pending results... 29922 1726853666.17064: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 29922 1726853666.17377: in run() - task 02083763-bbaf-51d4-513b-000000000018 29922 1726853666.17382: variable 'ansible_search_path' from source: unknown 29922 1726853666.17385: variable 'ansible_search_path' from source: unknown 29922 1726853666.17388: calling self._execute() 29922 1726853666.17544: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853666.17602: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853666.17804: variable 'omit' from source: magic vars 29922 1726853666.18540: variable 'ansible_distribution_major_version' from source: facts 29922 1726853666.18601: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853666.18615: variable 'omit' from source: magic vars 29922 1726853666.18745: variable 'omit' from source: magic vars 29922 1726853666.19126: variable 'network_provider' from source: set_fact 29922 1726853666.19130: variable 'omit' from source: magic vars 29922 1726853666.19133: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29922 1726853666.19276: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29922 1726853666.19307: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29922 1726853666.19332: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853666.19375: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 
1726853666.19492: variable 'inventory_hostname' from source: host vars for 'managed_node3' 29922 1726853666.19508: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853666.19518: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853666.19828: Set connection var ansible_connection to ssh 29922 1726853666.19832: Set connection var ansible_module_compression to ZIP_DEFLATED 29922 1726853666.19835: Set connection var ansible_shell_executable to /bin/sh 29922 1726853666.19837: Set connection var ansible_pipelining to False 29922 1726853666.19849: Set connection var ansible_timeout to 10 29922 1726853666.19856: Set connection var ansible_shell_type to sh 29922 1726853666.19914: variable 'ansible_shell_executable' from source: unknown 29922 1726853666.19936: variable 'ansible_connection' from source: unknown 29922 1726853666.19939: variable 'ansible_module_compression' from source: unknown 29922 1726853666.19942: variable 'ansible_shell_type' from source: unknown 29922 1726853666.19995: variable 'ansible_shell_executable' from source: unknown 29922 1726853666.19998: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853666.20001: variable 'ansible_pipelining' from source: unknown 29922 1726853666.20008: variable 'ansible_timeout' from source: unknown 29922 1726853666.20011: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853666.20146: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 29922 1726853666.20173: variable 'omit' from source: magic vars 29922 1726853666.20184: starting attempt loop 29922 1726853666.20212: running the handler 29922 1726853666.20249: handler run complete 29922 1726853666.20278: attempt loop complete, returning result 29922 1726853666.20321: _execute() done 29922 1726853666.20324: dumping result to json 29922 1726853666.20327: done dumping result, returning 29922 1726853666.20334: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [02083763-bbaf-51d4-513b-000000000018] 29922 1726853666.20336: sending task result for task 02083763-bbaf-51d4-513b-000000000018 ok: [managed_node3] => {} MSG: Using network provider: nm 29922 1726853666.20608: no more pending results, returning what we have 29922 1726853666.20612: results queue empty 29922 1726853666.20613: checking for any_errors_fatal 29922 1726853666.20626: done checking for any_errors_fatal 29922 1726853666.20627: checking for max_fail_percentage 29922 1726853666.20629: done checking for max_fail_percentage 29922 1726853666.20630: checking to see if all hosts have failed and the running result is not ok 29922 1726853666.20631: done checking to see if all hosts have failed 29922 1726853666.20632: getting the remaining hosts for this loop 29922 1726853666.20634: done getting the remaining hosts for this loop 29922 1726853666.20638: getting the next task for host managed_node3 29922 1726853666.20646: done getting next task for host managed_node3 29922 1726853666.20651: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 29922 1726853666.20654: ^ state is: HOST STATE: block=2, 
task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29922 1726853666.20667: getting variables 29922 1726853666.20668: in VariableManager get_vars() 29922 1726853666.20810: Calling all_inventory to load vars for managed_node3 29922 1726853666.20814: Calling groups_inventory to load vars for managed_node3 29922 1726853666.20816: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853666.20830: Calling all_plugins_play to load vars for managed_node3 29922 1726853666.20833: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853666.20836: Calling groups_plugins_play to load vars for managed_node3 29922 1726853666.21536: done sending task result for task 02083763-bbaf-51d4-513b-000000000018 29922 1726853666.21539: WORKER PROCESS EXITING 29922 1726853666.22420: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853666.24262: done with get_vars() 29922 1726853666.24296: done getting variables 29922 1726853666.24355: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 13:34:26 -0400 (0:00:00.087) 0:00:15.173 ****** 29922 1726853666.24400: entering _queue_task() for managed_node3/fail 29922 1726853666.25002: worker is 1 (out of 1 available) 29922 1726853666.25010: exiting _queue_task() for managed_node3/fail 29922 1726853666.25021: done queuing things up, now waiting for results queue to drain 29922 1726853666.25022: waiting for pending results... 
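For reference, the two tasks traced above, the censored package_facts run and the "Print network provider" debug message, correspond roughly to role tasks of the following shape. This is a sketch reconstructed from the trace, not the verbatim fedora.linux_system_roles.network source: the module arguments (manager: auto, strategy: first) and the no_log flag come from the invocation shown above, the debug message mirrors the "Using network provider: nm" output, and the exact task layout and wording are assumptions.

# Sketch reconstructed from the trace above; not the role's verbatim source.
- name: Check which packages are installed
  package_facts:
    manager: auto     # invocation above shows manager: ["auto"]
    strategy: first   # and strategy: "first"
  no_log: true        # result shown as censored because no_log was set

- name: Print network provider
  debug:
    msg: "Using network provider: {{ network_provider }}"  # printed above as "Using network provider: nm"

The package_facts result populates ansible_facts.packages as a dict keyed by package name, each value a list of {name, version, release, epoch, arch, source} entries, which is exactly the structure of the large JSON payload earlier in this log.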
29922 1726853666.25084: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 29922 1726853666.25234: in run() - task 02083763-bbaf-51d4-513b-000000000019 29922 1726853666.25269: variable 'ansible_search_path' from source: unknown 29922 1726853666.25280: variable 'ansible_search_path' from source: unknown 29922 1726853666.25321: calling self._execute() 29922 1726853666.25421: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853666.25434: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853666.25449: variable 'omit' from source: magic vars 29922 1726853666.25839: variable 'ansible_distribution_major_version' from source: facts 29922 1726853666.25856: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853666.25989: variable 'network_state' from source: role '' defaults 29922 1726853666.26018: Evaluated conditional (network_state != {}): False 29922 1726853666.26027: when evaluation is False, skipping this task 29922 1726853666.26035: _execute() done 29922 1726853666.26043: dumping result to json 29922 1726853666.26051: done dumping result, returning 29922 1726853666.26063: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [02083763-bbaf-51d4-513b-000000000019] 29922 1726853666.26077: sending task result for task 02083763-bbaf-51d4-513b-000000000019 skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 29922 1726853666.26280: no more pending results, returning what we have 29922 1726853666.26285: results queue empty 29922 1726853666.26285: checking for any_errors_fatal 29922 1726853666.26294: done checking for any_errors_fatal 29922 1726853666.26295: checking for max_fail_percentage 29922 1726853666.26296: done checking for max_fail_percentage 29922 1726853666.26298: checking to see if all hosts have failed and the running result is not ok 29922 1726853666.26299: done checking to see if all hosts have failed 29922 1726853666.26299: getting the remaining hosts for this loop 29922 1726853666.26300: done getting the remaining hosts for this loop 29922 1726853666.26304: getting the next task for host managed_node3 29922 1726853666.26313: done getting next task for host managed_node3 29922 1726853666.26317: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 29922 1726853666.26321: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853666.26350: getting variables 29922 1726853666.26354: in VariableManager get_vars() 29922 1726853666.26399: Calling all_inventory to load vars for managed_node3 29922 1726853666.26402: Calling groups_inventory to load vars for managed_node3 29922 1726853666.26404: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853666.26417: Calling all_plugins_play to load vars for managed_node3 29922 1726853666.26420: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853666.26423: Calling groups_plugins_play to load vars for managed_node3 29922 1726853666.27095: done sending task result for task 02083763-bbaf-51d4-513b-000000000019 29922 1726853666.27099: WORKER PROCESS EXITING 29922 1726853666.28325: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853666.29921: done with get_vars() 29922 1726853666.29951: done getting variables 29922 1726853666.30009: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 13:34:26 -0400 (0:00:00.056) 0:00:15.229 ****** 29922 1726853666.30041: entering _queue_task() for managed_node3/fail 29922 1726853666.30386: worker is 1 (out of 1 available) 29922 1726853666.30399: exiting _queue_task() for managed_node3/fail 29922 1726853666.30410: done queuing things up, now waiting for results queue to drain 29922 1726853666.30411: waiting for pending results... 
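The same two-step evaluation repeats for the task at main.yml:18: the role-wide distribution gate is checked first, then the task-specific guard, and the first condition that evaluates False is what the skip result records as false_condition. A rough sketch under that assumption (the version comparison is inferred from the task title, not shown in this log):

    - name: Abort applying the network state configuration if the system version of the managed host is below 8
      ansible.builtin.fail:
        msg: The network state configuration requires a managed host running EL 8 or later.  # assumed wording
      when:
        - network_state != {}                              # reported as false_condition in the skip result below
        - ansible_distribution_major_version | int < 8     # assumed guard implied by the task name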
29922 1726853666.30792: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 29922 1726853666.30813: in run() - task 02083763-bbaf-51d4-513b-00000000001a 29922 1726853666.30833: variable 'ansible_search_path' from source: unknown 29922 1726853666.30840: variable 'ansible_search_path' from source: unknown 29922 1726853666.30879: calling self._execute() 29922 1726853666.30967: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853666.30981: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853666.30997: variable 'omit' from source: magic vars 29922 1726853666.31581: variable 'ansible_distribution_major_version' from source: facts 29922 1726853666.31585: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853666.31835: variable 'network_state' from source: role '' defaults 29922 1726853666.31846: Evaluated conditional (network_state != {}): False 29922 1726853666.31849: when evaluation is False, skipping this task 29922 1726853666.31853: _execute() done 29922 1726853666.31856: dumping result to json 29922 1726853666.31858: done dumping result, returning 29922 1726853666.31867: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [02083763-bbaf-51d4-513b-00000000001a] 29922 1726853666.31874: sending task result for task 02083763-bbaf-51d4-513b-00000000001a 29922 1726853666.31992: done sending task result for task 02083763-bbaf-51d4-513b-00000000001a 29922 1726853666.31995: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 29922 1726853666.32051: no more pending results, returning what we have 29922 1726853666.32055: results queue empty 29922 1726853666.32056: checking for any_errors_fatal 29922 1726853666.32064: done checking for any_errors_fatal 29922 1726853666.32065: checking for max_fail_percentage 29922 1726853666.32066: done checking for max_fail_percentage 29922 1726853666.32067: checking to see if all hosts have failed and the running result is not ok 29922 1726853666.32068: done checking to see if all hosts have failed 29922 1726853666.32069: getting the remaining hosts for this loop 29922 1726853666.32070: done getting the remaining hosts for this loop 29922 1726853666.32076: getting the next task for host managed_node3 29922 1726853666.32083: done getting next task for host managed_node3 29922 1726853666.32087: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 29922 1726853666.32090: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853666.32107: getting variables 29922 1726853666.32108: in VariableManager get_vars() 29922 1726853666.32146: Calling all_inventory to load vars for managed_node3 29922 1726853666.32148: Calling groups_inventory to load vars for managed_node3 29922 1726853666.32151: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853666.32163: Calling all_plugins_play to load vars for managed_node3 29922 1726853666.32166: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853666.32168: Calling groups_plugins_play to load vars for managed_node3 29922 1726853666.33627: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853666.34506: done with get_vars() 29922 1726853666.34525: done getting variables 29922 1726853666.34572: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 13:34:26 -0400 (0:00:00.045) 0:00:15.275 ****** 29922 1726853666.34598: entering _queue_task() for managed_node3/fail 29922 1726853666.34849: worker is 1 (out of 1 available) 29922 1726853666.34864: exiting _queue_task() for managed_node3/fail 29922 1726853666.34878: done queuing things up, now waiting for results queue to drain 29922 1726853666.34880: waiting for pending results... 
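For the teaming task at main.yml:25 the guard chain is visible verbatim further down in the log: the version check (| int > 9) and the distribution check against __network_rh_distros both evaluate True, and the selectattr pipeline over network_connections and network_state.interfaces evaluates False, producing the skip. A sketch assembling those logged conditions into a task shape; the surrounding fail body and message are assumptions:

    - name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
      ansible.builtin.fail:
        msg: Team interfaces are not supported on EL10 or later.  # assumed wording
      when:
        - ansible_distribution_major_version | int > 9
        - ansible_distribution in __network_rh_distros
        - network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0
          or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0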
29922 1726853666.35061: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 29922 1726853666.35152: in run() - task 02083763-bbaf-51d4-513b-00000000001b 29922 1726853666.35167: variable 'ansible_search_path' from source: unknown 29922 1726853666.35173: variable 'ansible_search_path' from source: unknown 29922 1726853666.35199: calling self._execute() 29922 1726853666.35268: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853666.35273: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853666.35282: variable 'omit' from source: magic vars 29922 1726853666.35556: variable 'ansible_distribution_major_version' from source: facts 29922 1726853666.35569: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853666.35693: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 29922 1726853666.38328: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 29922 1726853666.38416: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 29922 1726853666.38474: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 29922 1726853666.38478: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 29922 1726853666.38610: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 29922 1726853666.38883: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853666.38887: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29922 1726853666.38889: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853666.38891: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853666.38893: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853666.39097: variable 'ansible_distribution_major_version' from source: facts 29922 1726853666.39115: Evaluated conditional (ansible_distribution_major_version | int > 9): True 29922 1726853666.39339: variable 'ansible_distribution' from source: facts 29922 1726853666.39342: variable '__network_rh_distros' from source: role '' defaults 29922 1726853666.39353: Evaluated conditional (ansible_distribution in __network_rh_distros): True 29922 1726853666.40031: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853666.40034: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29922 1726853666.40037: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853666.40039: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853666.40042: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853666.40073: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853666.40137: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29922 1726853666.40195: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853666.40234: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853666.40249: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853666.40294: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853666.40316: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29922 1726853666.40339: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853666.40582: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853666.40585: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853666.40777: variable 'network_connections' from source: task vars 29922 1726853666.40780: variable 'interface' from source: set_fact 29922 1726853666.40833: variable 'interface' from source: set_fact 29922 1726853666.40846: variable 'interface' from source: set_fact 29922 1726853666.40912: variable 'interface' from source: set_fact 29922 1726853666.40944: variable 'network_state' from source: role '' defaults 29922 
1726853666.41010: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 29922 1726853666.41181: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 29922 1726853666.41223: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 29922 1726853666.41259: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 29922 1726853666.41291: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 29922 1726853666.41340: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 29922 1726853666.41379: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 29922 1726853666.41408: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853666.41437: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 29922 1726853666.41482: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 29922 1726853666.41491: when evaluation is False, skipping this task 29922 1726853666.41498: _execute() done 29922 1726853666.41506: dumping result to json 29922 1726853666.41513: done dumping result, returning 29922 1726853666.41524: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [02083763-bbaf-51d4-513b-00000000001b] 29922 1726853666.41534: sending task result for task 02083763-bbaf-51d4-513b-00000000001b 29922 1726853666.41639: done sending task result for task 02083763-bbaf-51d4-513b-00000000001b 29922 1726853666.41647: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 29922 1726853666.41704: no more pending results, returning what we have 29922 1726853666.41708: results queue empty 29922 1726853666.41708: checking for any_errors_fatal 29922 1726853666.41715: done checking for any_errors_fatal 29922 1726853666.41716: checking for max_fail_percentage 29922 1726853666.41717: done checking for max_fail_percentage 29922 1726853666.41718: checking to see if all hosts have failed and the running result is not ok 29922 1726853666.41719: done checking to see if all hosts have failed 29922 1726853666.41719: getting the remaining hosts for this loop 29922 1726853666.41721: done getting the remaining hosts for this loop 29922 1726853666.41724: getting the next 
task for host managed_node3 29922 1726853666.41730: done getting next task for host managed_node3 29922 1726853666.41734: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 29922 1726853666.41736: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29922 1726853666.41754: getting variables 29922 1726853666.41755: in VariableManager get_vars() 29922 1726853666.41794: Calling all_inventory to load vars for managed_node3 29922 1726853666.41796: Calling groups_inventory to load vars for managed_node3 29922 1726853666.41798: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853666.41808: Calling all_plugins_play to load vars for managed_node3 29922 1726853666.41810: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853666.41812: Calling groups_plugins_play to load vars for managed_node3 29922 1726853666.44491: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853666.46450: done with get_vars() 29922 1726853666.46479: done getting variables 29922 1726853666.46588: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 13:34:26 -0400 (0:00:00.120) 0:00:15.395 ****** 29922 1726853666.46625: entering _queue_task() for managed_node3/dnf 29922 1726853666.47078: worker is 1 (out of 1 available) 29922 1726853666.47092: exiting _queue_task() for managed_node3/dnf 29922 1726853666.47102: done queuing things up, now waiting for results queue to drain 29922 1726853666.47104: waiting for pending results... 
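The DNF task queued above (main.yml:36) is gated on a distribution check that passes (Fedora, or major version > 7) and on __network_wireless_connections_defined or __network_team_connections_defined, which evaluates False below because the configured interface is neither wireless nor team. A hedged sketch of such a check; check_mode, state, and the use of the role's network_packages variable are assumptions, not the role's literal source:

    - name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
      ansible.builtin.dnf:
        name: "{{ network_packages }}"   # assumed: the role-default package list seen later in this log
        state: latest                    # assumed: probe for available updates
      check_mode: true                   # assumed: dry run, no changes applied
      when:
        - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7
        - __network_wireless_connections_defined or __network_team_connections_defined   # reported as false_condition below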
29922 1726853666.47716: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 29922 1726853666.47965: in run() - task 02083763-bbaf-51d4-513b-00000000001c 29922 1726853666.47990: variable 'ansible_search_path' from source: unknown 29922 1726853666.47994: variable 'ansible_search_path' from source: unknown 29922 1726853666.48077: calling self._execute() 29922 1726853666.48314: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853666.48390: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853666.48399: variable 'omit' from source: magic vars 29922 1726853666.48965: variable 'ansible_distribution_major_version' from source: facts 29922 1726853666.48979: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853666.49186: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 29922 1726853666.50852: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 29922 1726853666.51147: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 29922 1726853666.51176: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 29922 1726853666.51208: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 29922 1726853666.51224: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 29922 1726853666.51288: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853666.51309: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29922 1726853666.51328: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853666.51354: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853666.51369: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853666.51449: variable 'ansible_distribution' from source: facts 29922 1726853666.51453: variable 'ansible_distribution_major_version' from source: facts 29922 1726853666.51468: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 29922 1726853666.51548: variable '__network_wireless_connections_defined' from source: role '' defaults 29922 1726853666.51638: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853666.51656: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29922 1726853666.51677: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853666.51702: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853666.51713: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853666.51739: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853666.51758: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29922 1726853666.51778: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853666.51802: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853666.51814: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853666.51842: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853666.51859: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29922 1726853666.51882: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853666.51905: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853666.51919: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853666.52015: variable 'network_connections' from source: task vars 29922 1726853666.52028: variable 'interface' from source: set_fact 29922 1726853666.52076: variable 'interface' from source: set_fact 29922 1726853666.52088: variable 'interface' from source: set_fact 29922 1726853666.52152: variable 'interface' from source: set_fact 29922 1726853666.52268: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' 
skipped due to reserved name 29922 1726853666.52439: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 29922 1726853666.52493: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 29922 1726853666.52541: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 29922 1726853666.52578: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 29922 1726853666.52638: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 29922 1726853666.52663: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 29922 1726853666.52702: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853666.52744: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 29922 1726853666.52806: variable '__network_team_connections_defined' from source: role '' defaults 29922 1726853666.53093: variable 'network_connections' from source: task vars 29922 1726853666.53137: variable 'interface' from source: set_fact 29922 1726853666.53176: variable 'interface' from source: set_fact 29922 1726853666.53189: variable 'interface' from source: set_fact 29922 1726853666.53248: variable 'interface' from source: set_fact 29922 1726853666.53293: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 29922 1726853666.53296: when evaluation is False, skipping this task 29922 1726853666.53299: _execute() done 29922 1726853666.53302: dumping result to json 29922 1726853666.53304: done dumping result, returning 29922 1726853666.53311: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [02083763-bbaf-51d4-513b-00000000001c] 29922 1726853666.53315: sending task result for task 02083763-bbaf-51d4-513b-00000000001c skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 29922 1726853666.53451: no more pending results, returning what we have 29922 1726853666.53454: results queue empty 29922 1726853666.53455: checking for any_errors_fatal 29922 1726853666.53461: done checking for any_errors_fatal 29922 1726853666.53461: checking for max_fail_percentage 29922 1726853666.53463: done checking for max_fail_percentage 29922 1726853666.53464: checking to see if all hosts have failed and the running result is not ok 29922 1726853666.53464: done checking to see if all hosts have failed 29922 1726853666.53465: getting the remaining hosts for this loop 29922 1726853666.53466: done getting the remaining hosts for this loop 29922 1726853666.53470: getting the next task for host managed_node3 29922 1726853666.53477: done getting next task for host managed_node3 29922 
1726853666.53481: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 29922 1726853666.53484: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29922 1726853666.53499: getting variables 29922 1726853666.53501: in VariableManager get_vars() 29922 1726853666.53535: Calling all_inventory to load vars for managed_node3 29922 1726853666.53537: Calling groups_inventory to load vars for managed_node3 29922 1726853666.53539: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853666.53549: Calling all_plugins_play to load vars for managed_node3 29922 1726853666.53552: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853666.53554: Calling groups_plugins_play to load vars for managed_node3 29922 1726853666.54085: done sending task result for task 02083763-bbaf-51d4-513b-00000000001c 29922 1726853666.54089: WORKER PROCESS EXITING 29922 1726853666.54518: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853666.55384: done with get_vars() 29922 1726853666.55400: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 29922 1726853666.55459: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 13:34:26 -0400 (0:00:00.088) 0:00:15.484 ****** 29922 1726853666.55482: entering _queue_task() for managed_node3/yum 29922 1726853666.55483: Creating lock for yum 29922 1726853666.55731: worker is 1 (out of 1 available) 29922 1726853666.55744: exiting _queue_task() for managed_node3/yum 29922 1726853666.55760: done queuing things up, now waiting for results queue to drain 29922 1726853666.55761: waiting for pending results... 
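The YUM counterpart at main.yml:48 is only meant for EL 7 and older, so on this host its guard ansible_distribution_major_version | int < 8 evaluates False and the task is skipped; the log also notes that ansible.builtin.yum is redirected to ansible.builtin.dnf on this platform. A sketch along the same lines as the DNF task above, with the same assumed parameters:

    - name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
      ansible.builtin.yum:               # redirected to ansible.builtin.dnf on this host, per the log
        name: "{{ network_packages }}"   # assumed package list variable
        state: latest                    # assumed: probe for available updates
      check_mode: true                   # assumed: dry run
      when:
        - ansible_distribution_major_version | int < 8   # reported as false_condition below
        - __network_wireless_connections_defined or __network_team_connections_defined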
29922 1726853666.55945: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 29922 1726853666.56036: in run() - task 02083763-bbaf-51d4-513b-00000000001d 29922 1726853666.56048: variable 'ansible_search_path' from source: unknown 29922 1726853666.56052: variable 'ansible_search_path' from source: unknown 29922 1726853666.56086: calling self._execute() 29922 1726853666.56150: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853666.56154: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853666.56179: variable 'omit' from source: magic vars 29922 1726853666.56576: variable 'ansible_distribution_major_version' from source: facts 29922 1726853666.56580: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853666.56707: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 29922 1726853666.58558: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 29922 1726853666.58613: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 29922 1726853666.58640: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 29922 1726853666.58668: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 29922 1726853666.58692: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 29922 1726853666.58750: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853666.58772: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29922 1726853666.58792: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853666.58820: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853666.58831: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853666.58904: variable 'ansible_distribution_major_version' from source: facts 29922 1726853666.58918: Evaluated conditional (ansible_distribution_major_version | int < 8): False 29922 1726853666.58921: when evaluation is False, skipping this task 29922 1726853666.58923: _execute() done 29922 1726853666.58926: dumping result to json 29922 1726853666.58929: done dumping result, returning 29922 1726853666.58936: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [02083763-bbaf-51d4-513b-00000000001d] 29922 
1726853666.58941: sending task result for task 02083763-bbaf-51d4-513b-00000000001d 29922 1726853666.59030: done sending task result for task 02083763-bbaf-51d4-513b-00000000001d 29922 1726853666.59034: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 29922 1726853666.59090: no more pending results, returning what we have 29922 1726853666.59094: results queue empty 29922 1726853666.59095: checking for any_errors_fatal 29922 1726853666.59100: done checking for any_errors_fatal 29922 1726853666.59101: checking for max_fail_percentage 29922 1726853666.59102: done checking for max_fail_percentage 29922 1726853666.59103: checking to see if all hosts have failed and the running result is not ok 29922 1726853666.59104: done checking to see if all hosts have failed 29922 1726853666.59105: getting the remaining hosts for this loop 29922 1726853666.59106: done getting the remaining hosts for this loop 29922 1726853666.59109: getting the next task for host managed_node3 29922 1726853666.59116: done getting next task for host managed_node3 29922 1726853666.59119: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 29922 1726853666.59122: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853666.59135: getting variables 29922 1726853666.59137: in VariableManager get_vars() 29922 1726853666.59186: Calling all_inventory to load vars for managed_node3 29922 1726853666.59188: Calling groups_inventory to load vars for managed_node3 29922 1726853666.59190: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853666.59200: Calling all_plugins_play to load vars for managed_node3 29922 1726853666.59202: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853666.59204: Calling groups_plugins_play to load vars for managed_node3 29922 1726853666.60025: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853666.61142: done with get_vars() 29922 1726853666.61167: done getting variables 29922 1726853666.61232: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 13:34:26 -0400 (0:00:00.057) 0:00:15.542 ****** 29922 1726853666.61266: entering _queue_task() for managed_node3/fail 29922 1726853666.61615: worker is 1 (out of 1 available) 29922 1726853666.61630: exiting _queue_task() for managed_node3/fail 29922 1726853666.61642: done queuing things up, now waiting for results queue to drain 29922 1726853666.61643: waiting for pending results... 
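The "Ask user's consent" task at main.yml:60 is loaded through the fail action plugin, so it presumably aborts the run unless an explicit opt-in variable allows NetworkManager to be restarted; here it is skipped for the same reason as the two package checks (no wireless or team connections are defined). A sketch under that assumption; the variable name network_allow_restart and the message are hypothetical and do not appear in this log:

    - name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
      ansible.builtin.fail:
        msg: Configuring wireless or team interfaces requires restarting NetworkManager; set the opt-in variable to allow it.  # assumed wording
      when:
        - __network_wireless_connections_defined or __network_team_connections_defined   # reported as false_condition below
        - not network_allow_restart   # hypothetical opt-in variable, for illustration only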
29922 1726853666.61938: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 29922 1726853666.62033: in run() - task 02083763-bbaf-51d4-513b-00000000001e 29922 1726853666.62049: variable 'ansible_search_path' from source: unknown 29922 1726853666.62053: variable 'ansible_search_path' from source: unknown 29922 1726853666.62148: calling self._execute() 29922 1726853666.62167: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853666.62175: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853666.62187: variable 'omit' from source: magic vars 29922 1726853666.62520: variable 'ansible_distribution_major_version' from source: facts 29922 1726853666.62562: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853666.62708: variable '__network_wireless_connections_defined' from source: role '' defaults 29922 1726853666.62923: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 29922 1726853666.68318: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 29922 1726853666.68367: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 29922 1726853666.68395: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 29922 1726853666.68418: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 29922 1726853666.68436: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 29922 1726853666.68490: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853666.68508: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29922 1726853666.68525: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853666.68550: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853666.68562: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853666.68598: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853666.68614: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29922 1726853666.68630: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853666.68658: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853666.68667: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853666.68700: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853666.68715: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29922 1726853666.68731: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853666.68757: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853666.68766: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853666.68875: variable 'network_connections' from source: task vars 29922 1726853666.68883: variable 'interface' from source: set_fact 29922 1726853666.68935: variable 'interface' from source: set_fact 29922 1726853666.68942: variable 'interface' from source: set_fact 29922 1726853666.68986: variable 'interface' from source: set_fact 29922 1726853666.69047: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 29922 1726853666.69163: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 29922 1726853666.69189: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 29922 1726853666.69210: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 29922 1726853666.69235: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 29922 1726853666.69265: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 29922 1726853666.69282: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 29922 1726853666.69298: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853666.69315: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 29922 1726853666.69354: 
variable '__network_team_connections_defined' from source: role '' defaults 29922 1726853666.69502: variable 'network_connections' from source: task vars 29922 1726853666.69505: variable 'interface' from source: set_fact 29922 1726853666.69550: variable 'interface' from source: set_fact 29922 1726853666.69553: variable 'interface' from source: set_fact 29922 1726853666.69597: variable 'interface' from source: set_fact 29922 1726853666.69631: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 29922 1726853666.69634: when evaluation is False, skipping this task 29922 1726853666.69637: _execute() done 29922 1726853666.69639: dumping result to json 29922 1726853666.69641: done dumping result, returning 29922 1726853666.69647: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [02083763-bbaf-51d4-513b-00000000001e] 29922 1726853666.69661: sending task result for task 02083763-bbaf-51d4-513b-00000000001e 29922 1726853666.69735: done sending task result for task 02083763-bbaf-51d4-513b-00000000001e 29922 1726853666.69738: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 29922 1726853666.69821: no more pending results, returning what we have 29922 1726853666.69824: results queue empty 29922 1726853666.69825: checking for any_errors_fatal 29922 1726853666.69831: done checking for any_errors_fatal 29922 1726853666.69832: checking for max_fail_percentage 29922 1726853666.69834: done checking for max_fail_percentage 29922 1726853666.69835: checking to see if all hosts have failed and the running result is not ok 29922 1726853666.69835: done checking to see if all hosts have failed 29922 1726853666.69836: getting the remaining hosts for this loop 29922 1726853666.69837: done getting the remaining hosts for this loop 29922 1726853666.69841: getting the next task for host managed_node3 29922 1726853666.69846: done getting next task for host managed_node3 29922 1726853666.69850: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 29922 1726853666.69853: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853666.69872: getting variables 29922 1726853666.69874: in VariableManager get_vars() 29922 1726853666.69916: Calling all_inventory to load vars for managed_node3 29922 1726853666.69918: Calling groups_inventory to load vars for managed_node3 29922 1726853666.69920: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853666.69929: Calling all_plugins_play to load vars for managed_node3 29922 1726853666.69932: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853666.69934: Calling groups_plugins_play to load vars for managed_node3 29922 1726853666.73938: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853666.75520: done with get_vars() 29922 1726853666.75598: done getting variables 29922 1726853666.75783: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 13:34:26 -0400 (0:00:00.146) 0:00:15.688 ****** 29922 1726853666.75931: entering _queue_task() for managed_node3/package 29922 1726853666.76269: worker is 1 (out of 1 available) 29922 1726853666.76285: exiting _queue_task() for managed_node3/package 29922 1726853666.76299: done queuing things up, now waiting for results queue to drain 29922 1726853666.76300: waiting for pending results... 29922 1726853666.76497: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages 29922 1726853666.76630: in run() - task 02083763-bbaf-51d4-513b-00000000001f 29922 1726853666.76665: variable 'ansible_search_path' from source: unknown 29922 1726853666.76669: variable 'ansible_search_path' from source: unknown 29922 1726853666.76710: calling self._execute() 29922 1726853666.76784: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853666.76788: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853666.76797: variable 'omit' from source: magic vars 29922 1726853666.77104: variable 'ansible_distribution_major_version' from source: facts 29922 1726853666.77114: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853666.77316: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 29922 1726853666.77579: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 29922 1726853666.77630: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 29922 1726853666.77726: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 29922 1726853666.77764: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 29922 1726853666.77853: variable 'network_packages' from source: role '' defaults 29922 1726853666.77951: variable '__network_provider_setup' from source: role '' defaults 29922 1726853666.77957: variable '__network_service_name_default_nm' from source: role '' defaults 29922 1726853666.78019: variable 
'__network_service_name_default_nm' from source: role '' defaults 29922 1726853666.78028: variable '__network_packages_default_nm' from source: role '' defaults 29922 1726853666.78087: variable '__network_packages_default_nm' from source: role '' defaults 29922 1726853666.78197: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 29922 1726853666.80375: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 29922 1726853666.80415: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 29922 1726853666.80458: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 29922 1726853666.80509: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 29922 1726853666.80554: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 29922 1726853666.80778: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853666.80782: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29922 1726853666.80785: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853666.80787: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853666.80790: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853666.80826: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853666.80849: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29922 1726853666.80877: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853666.80977: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853666.80981: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853666.81255: variable '__network_packages_default_gobject_packages' from source: role '' defaults 29922 1726853666.81329: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853666.81367: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29922 1726853666.81403: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853666.81451: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853666.81508: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853666.81611: variable 'ansible_python' from source: facts 29922 1726853666.81631: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 29922 1726853666.81700: variable '__network_wpa_supplicant_required' from source: role '' defaults 29922 1726853666.81792: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 29922 1726853666.81898: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853666.81919: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29922 1726853666.81943: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853666.81987: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853666.81997: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853666.82077: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853666.82098: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29922 1726853666.82121: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853666.82167: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853666.82190: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853666.82336: variable 'network_connections' from source: task vars 29922 1726853666.82339: variable 'interface' from source: set_fact 29922 1726853666.82444: variable 'interface' from source: set_fact 29922 1726853666.82447: variable 'interface' from source: set_fact 29922 1726853666.82523: variable 'interface' from source: set_fact 29922 1726853666.82603: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 29922 1726853666.82624: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 29922 1726853666.82646: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853666.82668: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 29922 1726853666.82706: variable '__network_wireless_connections_defined' from source: role '' defaults 29922 1726853666.82937: variable 'network_connections' from source: task vars 29922 1726853666.82942: variable 'interface' from source: set_fact 29922 1726853666.83011: variable 'interface' from source: set_fact 29922 1726853666.83018: variable 'interface' from source: set_fact 29922 1726853666.83109: variable 'interface' from source: set_fact 29922 1726853666.83197: variable '__network_packages_default_wireless' from source: role '' defaults 29922 1726853666.83254: variable '__network_wireless_connections_defined' from source: role '' defaults 29922 1726853666.83786: variable 'network_connections' from source: task vars 29922 1726853666.83789: variable 'interface' from source: set_fact 29922 1726853666.83792: variable 'interface' from source: set_fact 29922 1726853666.83794: variable 'interface' from source: set_fact 29922 1726853666.83797: variable 'interface' from source: set_fact 29922 1726853666.83881: variable '__network_packages_default_team' from source: role '' defaults 29922 1726853666.84000: variable '__network_team_connections_defined' from source: role '' defaults 29922 1726853666.84443: variable 'network_connections' from source: task vars 29922 1726853666.84451: variable 'interface' from source: set_fact 29922 1726853666.84539: variable 'interface' from source: set_fact 29922 1726853666.84545: variable 'interface' from source: set_fact 29922 1726853666.84637: variable 'interface' from source: set_fact 29922 1726853666.84733: variable '__network_service_name_default_initscripts' from source: role '' defaults 29922 1726853666.84826: variable '__network_service_name_default_initscripts' from source: role '' defaults 29922 1726853666.84853: variable '__network_packages_default_initscripts' from source: role '' defaults 29922 1726853666.84945: variable '__network_packages_default_initscripts' from source: role '' defaults 29922 1726853666.85298: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 29922 1726853666.86690: variable 'network_connections' from source: task vars 29922 1726853666.86737: variable 'interface' from source: set_fact 29922 
1726853666.86889: variable 'interface' from source: set_fact 29922 1726853666.86895: variable 'interface' from source: set_fact 29922 1726853666.87086: variable 'interface' from source: set_fact 29922 1726853666.87115: variable 'ansible_distribution' from source: facts 29922 1726853666.87119: variable '__network_rh_distros' from source: role '' defaults 29922 1726853666.87125: variable 'ansible_distribution_major_version' from source: facts 29922 1726853666.87150: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 29922 1726853666.87619: variable 'ansible_distribution' from source: facts 29922 1726853666.87626: variable '__network_rh_distros' from source: role '' defaults 29922 1726853666.87629: variable 'ansible_distribution_major_version' from source: facts 29922 1726853666.87640: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 29922 1726853666.87983: variable 'ansible_distribution' from source: facts 29922 1726853666.87990: variable '__network_rh_distros' from source: role '' defaults 29922 1726853666.87996: variable 'ansible_distribution_major_version' from source: facts 29922 1726853666.88180: variable 'network_provider' from source: set_fact 29922 1726853666.88198: variable 'ansible_facts' from source: unknown 29922 1726853666.90714: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 29922 1726853666.90719: when evaluation is False, skipping this task 29922 1726853666.90721: _execute() done 29922 1726853666.90724: dumping result to json 29922 1726853666.90726: done dumping result, returning 29922 1726853666.90786: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [02083763-bbaf-51d4-513b-00000000001f] 29922 1726853666.90789: sending task result for task 02083763-bbaf-51d4-513b-00000000001f 29922 1726853666.90967: done sending task result for task 02083763-bbaf-51d4-513b-00000000001f 29922 1726853666.90972: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 29922 1726853666.91035: no more pending results, returning what we have 29922 1726853666.91039: results queue empty 29922 1726853666.91040: checking for any_errors_fatal 29922 1726853666.91048: done checking for any_errors_fatal 29922 1726853666.91049: checking for max_fail_percentage 29922 1726853666.91051: done checking for max_fail_percentage 29922 1726853666.91052: checking to see if all hosts have failed and the running result is not ok 29922 1726853666.91053: done checking to see if all hosts have failed 29922 1726853666.91053: getting the remaining hosts for this loop 29922 1726853666.91057: done getting the remaining hosts for this loop 29922 1726853666.91061: getting the next task for host managed_node3 29922 1726853666.91068: done getting next task for host managed_node3 29922 1726853666.91074: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 29922 1726853666.91077: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29922 1726853666.91103: getting variables 29922 1726853666.91104: in VariableManager get_vars() 29922 1726853666.91349: Calling all_inventory to load vars for managed_node3 29922 1726853666.91352: Calling groups_inventory to load vars for managed_node3 29922 1726853666.91373: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853666.91385: Calling all_plugins_play to load vars for managed_node3 29922 1726853666.91387: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853666.91390: Calling groups_plugins_play to load vars for managed_node3 29922 1726853666.94630: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853666.98763: done with get_vars() 29922 1726853666.98788: done getting variables 29922 1726853666.98962: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 13:34:26 -0400 (0:00:00.231) 0:00:15.920 ****** 29922 1726853666.99059: entering _queue_task() for managed_node3/package 29922 1726853666.99982: worker is 1 (out of 1 available) 29922 1726853666.99991: exiting _queue_task() for managed_node3/package 29922 1726853667.00001: done queuing things up, now waiting for results queue to drain 29922 1726853667.00002: waiting for pending results... 
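The skip recorded above for main.yml:73 comes from the guard shown in its result: not network_packages is subset(ansible_facts.packages.keys()). Every package the role wants is already present in the gathered package facts, so the package action never has to run. A minimal sketch of a task built around that same guard follows; the module name matches the 'package' action plugin loaded above, but the role's actual task body is not shown in this log, so the rest is an illustration only:

    - name: Install packages
      ansible.builtin.package:
        name: "{{ network_packages }}"   # role default, assembled from the __network_packages_default_* variables resolved above
        state: present
      # Skip the (potentially slow) package transaction when everything is already installed
      when: not network_packages is subset(ansible_facts.packages.keys())
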
29922 1726853667.00473: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 29922 1726853667.00881: in run() - task 02083763-bbaf-51d4-513b-000000000020 29922 1726853667.00885: variable 'ansible_search_path' from source: unknown 29922 1726853667.00887: variable 'ansible_search_path' from source: unknown 29922 1726853667.00889: calling self._execute() 29922 1726853667.01106: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853667.01112: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853667.01121: variable 'omit' from source: magic vars 29922 1726853667.02365: variable 'ansible_distribution_major_version' from source: facts 29922 1726853667.02377: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853667.02826: variable 'network_state' from source: role '' defaults 29922 1726853667.02830: Evaluated conditional (network_state != {}): False 29922 1726853667.02832: when evaluation is False, skipping this task 29922 1726853667.02836: _execute() done 29922 1726853667.02838: dumping result to json 29922 1726853667.02840: done dumping result, returning 29922 1726853667.02842: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [02083763-bbaf-51d4-513b-000000000020] 29922 1726853667.02845: sending task result for task 02083763-bbaf-51d4-513b-000000000020 29922 1726853667.03010: done sending task result for task 02083763-bbaf-51d4-513b-000000000020 29922 1726853667.03014: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 29922 1726853667.03079: no more pending results, returning what we have 29922 1726853667.03083: results queue empty 29922 1726853667.03084: checking for any_errors_fatal 29922 1726853667.03091: done checking for any_errors_fatal 29922 1726853667.03092: checking for max_fail_percentage 29922 1726853667.03094: done checking for max_fail_percentage 29922 1726853667.03095: checking to see if all hosts have failed and the running result is not ok 29922 1726853667.03097: done checking to see if all hosts have failed 29922 1726853667.03098: getting the remaining hosts for this loop 29922 1726853667.03099: done getting the remaining hosts for this loop 29922 1726853667.03103: getting the next task for host managed_node3 29922 1726853667.03110: done getting next task for host managed_node3 29922 1726853667.03114: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 29922 1726853667.03118: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853667.03141: getting variables 29922 1726853667.03278: in VariableManager get_vars() 29922 1726853667.03319: Calling all_inventory to load vars for managed_node3 29922 1726853667.03322: Calling groups_inventory to load vars for managed_node3 29922 1726853667.03324: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853667.03337: Calling all_plugins_play to load vars for managed_node3 29922 1726853667.03378: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853667.03386: Calling groups_plugins_play to load vars for managed_node3 29922 1726853667.06807: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853667.10635: done with get_vars() 29922 1726853667.10666: done getting variables 29922 1726853667.11035: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 13:34:27 -0400 (0:00:00.120) 0:00:16.040 ****** 29922 1726853667.11068: entering _queue_task() for managed_node3/package 29922 1726853667.12121: worker is 1 (out of 1 available) 29922 1726853667.12133: exiting _queue_task() for managed_node3/package 29922 1726853667.12143: done queuing things up, now waiting for results queue to drain 29922 1726853667.12144: waiting for pending results... 
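The task at main.yml:85 was just skipped because its guard, network_state != {}, is False: the role default for network_state is an empty dict and this play never overrides it. The task queued above (main.yml:96) carries the same guard, so the trace below skips it for the same reason. A sketch of a play that would flip those conditionals to True is shown below; the interface details are purely illustrative and not taken from this run:

    - hosts: managed_node3
      roles:
        - role: fedora.linux_system_roles.network
      vars:
        # Any non-empty network_state makes the nmstate install tasks run
        network_state:
          interfaces:
            - name: eth1          # illustrative device name, assumed for this example
              type: ethernet
              state: up
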
29922 1726853667.12889: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 29922 1726853667.12924: in run() - task 02083763-bbaf-51d4-513b-000000000021 29922 1726853667.12939: variable 'ansible_search_path' from source: unknown 29922 1726853667.12943: variable 'ansible_search_path' from source: unknown 29922 1726853667.12984: calling self._execute() 29922 1726853667.13302: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853667.13309: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853667.13319: variable 'omit' from source: magic vars 29922 1726853667.13991: variable 'ansible_distribution_major_version' from source: facts 29922 1726853667.14002: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853667.14239: variable 'network_state' from source: role '' defaults 29922 1726853667.14250: Evaluated conditional (network_state != {}): False 29922 1726853667.14253: when evaluation is False, skipping this task 29922 1726853667.14255: _execute() done 29922 1726853667.14261: dumping result to json 29922 1726853667.14264: done dumping result, returning 29922 1726853667.14424: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [02083763-bbaf-51d4-513b-000000000021] 29922 1726853667.14428: sending task result for task 02083763-bbaf-51d4-513b-000000000021 29922 1726853667.14692: done sending task result for task 02083763-bbaf-51d4-513b-000000000021 29922 1726853667.14695: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 29922 1726853667.14743: no more pending results, returning what we have 29922 1726853667.14747: results queue empty 29922 1726853667.14748: checking for any_errors_fatal 29922 1726853667.14756: done checking for any_errors_fatal 29922 1726853667.14757: checking for max_fail_percentage 29922 1726853667.14759: done checking for max_fail_percentage 29922 1726853667.14760: checking to see if all hosts have failed and the running result is not ok 29922 1726853667.14761: done checking to see if all hosts have failed 29922 1726853667.14762: getting the remaining hosts for this loop 29922 1726853667.14763: done getting the remaining hosts for this loop 29922 1726853667.14767: getting the next task for host managed_node3 29922 1726853667.14777: done getting next task for host managed_node3 29922 1726853667.14782: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 29922 1726853667.14786: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853667.14803: getting variables 29922 1726853667.14805: in VariableManager get_vars() 29922 1726853667.14842: Calling all_inventory to load vars for managed_node3 29922 1726853667.14844: Calling groups_inventory to load vars for managed_node3 29922 1726853667.14846: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853667.14858: Calling all_plugins_play to load vars for managed_node3 29922 1726853667.14860: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853667.14863: Calling groups_plugins_play to load vars for managed_node3 29922 1726853667.18081: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853667.22221: done with get_vars() 29922 1726853667.22262: done getting variables 29922 1726853667.22390: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 13:34:27 -0400 (0:00:00.113) 0:00:16.153 ****** 29922 1726853667.22430: entering _queue_task() for managed_node3/service 29922 1726853667.22432: Creating lock for service 29922 1726853667.23105: worker is 1 (out of 1 available) 29922 1726853667.23117: exiting _queue_task() for managed_node3/service 29922 1726853667.23129: done queuing things up, now waiting for results queue to drain 29922 1726853667.23130: waiting for pending results... 
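The task queued above (main.yml:109) restarts NetworkManager only when wireless or team connection types are in play; its guard, __network_wireless_connections_defined or __network_team_connections_defined, is evaluated in the trace that follows and comes back False for the connections defined in this run. A minimal sketch of a task using that guard; the service module matches the 'service' action plugin loaded above, while the restart semantics are an assumption rather than the role's verbatim task body:

    - name: Restart NetworkManager due to wireless or team interfaces
      ansible.builtin.service:
        name: NetworkManager
        state: restarted
      when: __network_wireless_connections_defined or __network_team_connections_defined
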
29922 1726853667.23354: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 29922 1726853667.23886: in run() - task 02083763-bbaf-51d4-513b-000000000022 29922 1726853667.23890: variable 'ansible_search_path' from source: unknown 29922 1726853667.23893: variable 'ansible_search_path' from source: unknown 29922 1726853667.23896: calling self._execute() 29922 1726853667.24013: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853667.24027: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853667.24041: variable 'omit' from source: magic vars 29922 1726853667.25563: variable 'ansible_distribution_major_version' from source: facts 29922 1726853667.25567: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853667.25719: variable '__network_wireless_connections_defined' from source: role '' defaults 29922 1726853667.26350: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 29922 1726853667.29075: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 29922 1726853667.29129: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 29922 1726853667.29165: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 29922 1726853667.29193: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 29922 1726853667.29212: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 29922 1726853667.29281: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853667.29301: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29922 1726853667.29319: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853667.29344: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853667.29355: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853667.29421: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853667.29429: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29922 1726853667.29447: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 29922 1726853667.29481: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853667.29493: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853667.29521: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853667.29563: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29922 1726853667.29781: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853667.29784: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853667.29786: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853667.29817: variable 'network_connections' from source: task vars 29922 1726853667.29838: variable 'interface' from source: set_fact 29922 1726853667.29925: variable 'interface' from source: set_fact 29922 1726853667.29940: variable 'interface' from source: set_fact 29922 1726853667.30015: variable 'interface' from source: set_fact 29922 1726853667.30126: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 29922 1726853667.30314: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 29922 1726853667.30375: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 29922 1726853667.30411: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 29922 1726853667.30453: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 29922 1726853667.30503: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 29922 1726853667.30532: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 29922 1726853667.30570: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853667.30603: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 29922 1726853667.30675: variable '__network_team_connections_defined' from source: role '' defaults 
29922 1726853667.30922: variable 'network_connections' from source: task vars 29922 1726853667.30933: variable 'interface' from source: set_fact 29922 1726853667.31003: variable 'interface' from source: set_fact 29922 1726853667.31087: variable 'interface' from source: set_fact 29922 1726853667.31091: variable 'interface' from source: set_fact 29922 1726853667.31139: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 29922 1726853667.31147: when evaluation is False, skipping this task 29922 1726853667.31154: _execute() done 29922 1726853667.31160: dumping result to json 29922 1726853667.31167: done dumping result, returning 29922 1726853667.31184: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [02083763-bbaf-51d4-513b-000000000022] 29922 1726853667.31209: sending task result for task 02083763-bbaf-51d4-513b-000000000022 skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 29922 1726853667.31438: no more pending results, returning what we have 29922 1726853667.31442: results queue empty 29922 1726853667.31443: checking for any_errors_fatal 29922 1726853667.31452: done checking for any_errors_fatal 29922 1726853667.31453: checking for max_fail_percentage 29922 1726853667.31455: done checking for max_fail_percentage 29922 1726853667.31455: checking to see if all hosts have failed and the running result is not ok 29922 1726853667.31456: done checking to see if all hosts have failed 29922 1726853667.31457: getting the remaining hosts for this loop 29922 1726853667.31459: done getting the remaining hosts for this loop 29922 1726853667.31462: getting the next task for host managed_node3 29922 1726853667.31468: done getting next task for host managed_node3 29922 1726853667.31474: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 29922 1726853667.31477: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853667.31493: getting variables 29922 1726853667.31494: in VariableManager get_vars() 29922 1726853667.31535: Calling all_inventory to load vars for managed_node3 29922 1726853667.31538: Calling groups_inventory to load vars for managed_node3 29922 1726853667.31540: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853667.31552: Calling all_plugins_play to load vars for managed_node3 29922 1726853667.31555: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853667.31559: Calling groups_plugins_play to load vars for managed_node3 29922 1726853667.32187: done sending task result for task 02083763-bbaf-51d4-513b-000000000022 29922 1726853667.32191: WORKER PROCESS EXITING 29922 1726853667.33249: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853667.34894: done with get_vars() 29922 1726853667.34925: done getting variables 29922 1726853667.35096: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 13:34:27 -0400 (0:00:00.126) 0:00:16.280 ****** 29922 1726853667.35128: entering _queue_task() for managed_node3/service 29922 1726853667.35680: worker is 1 (out of 1 available) 29922 1726853667.35691: exiting _queue_task() for managed_node3/service 29922 1726853667.35701: done queuing things up, now waiting for results queue to drain 29922 1726853667.35703: waiting for pending results... 
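Unlike the earlier tasks in this block, the guard on main.yml:122 (network_provider == "nm" or network_state != {}) evaluates True in the trace below, so Ansible actually executes the task: it resolves network_service_name from __network_service_name_default_nm, reuses the multiplexed SSH connection, creates a remote temp directory, and uploads a self-contained AnsiballZ_systemd.py payload to run there. A minimal sketch of such a task; the systemd module is inferred from that payload name, and the exact task body in the role may differ:

    - name: Enable and start NetworkManager
      ansible.builtin.systemd:
        name: "{{ network_service_name }}"   # for the nm provider this resolves to the NetworkManager unit (per __network_service_name_default_nm)
        state: started
        enabled: true
      when: network_provider == "nm" or network_state != {}
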
29922 1726853667.35792: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 29922 1726853667.35934: in run() - task 02083763-bbaf-51d4-513b-000000000023 29922 1726853667.35956: variable 'ansible_search_path' from source: unknown 29922 1726853667.35964: variable 'ansible_search_path' from source: unknown 29922 1726853667.36006: calling self._execute() 29922 1726853667.36116: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853667.36129: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853667.36153: variable 'omit' from source: magic vars 29922 1726853667.36555: variable 'ansible_distribution_major_version' from source: facts 29922 1726853667.36575: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853667.36751: variable 'network_provider' from source: set_fact 29922 1726853667.36763: variable 'network_state' from source: role '' defaults 29922 1726853667.36803: Evaluated conditional (network_provider == "nm" or network_state != {}): True 29922 1726853667.36807: variable 'omit' from source: magic vars 29922 1726853667.36912: variable 'omit' from source: magic vars 29922 1726853667.36915: variable 'network_service_name' from source: role '' defaults 29922 1726853667.36974: variable 'network_service_name' from source: role '' defaults 29922 1726853667.37087: variable '__network_provider_setup' from source: role '' defaults 29922 1726853667.37100: variable '__network_service_name_default_nm' from source: role '' defaults 29922 1726853667.37173: variable '__network_service_name_default_nm' from source: role '' defaults 29922 1726853667.37188: variable '__network_packages_default_nm' from source: role '' defaults 29922 1726853667.37348: variable '__network_packages_default_nm' from source: role '' defaults 29922 1726853667.37499: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 29922 1726853667.41527: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 29922 1726853667.41608: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 29922 1726853667.41668: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 29922 1726853667.41716: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 29922 1726853667.41747: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 29922 1726853667.41843: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853667.41884: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29922 1726853667.41918: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853667.41961: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 
(found_in_cache=True, class_only=False) 29922 1726853667.41986: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853667.42041: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853667.42069: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29922 1726853667.42120: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853667.42156: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853667.42200: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853667.42452: variable '__network_packages_default_gobject_packages' from source: role '' defaults 29922 1726853667.42587: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853667.42636: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29922 1726853667.42650: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853667.42700: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853667.42744: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853667.42824: variable 'ansible_python' from source: facts 29922 1726853667.42856: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 29922 1726853667.42949: variable '__network_wpa_supplicant_required' from source: role '' defaults 29922 1726853667.43042: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 29922 1726853667.43308: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853667.43315: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29922 1726853667.43347: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853667.43395: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853667.43419: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853667.43469: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853667.43511: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29922 1726853667.43611: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853667.43614: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853667.43616: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853667.44088: variable 'network_connections' from source: task vars 29922 1726853667.44091: variable 'interface' from source: set_fact 29922 1726853667.44093: variable 'interface' from source: set_fact 29922 1726853667.44107: variable 'interface' from source: set_fact 29922 1726853667.44264: variable 'interface' from source: set_fact 29922 1726853667.44781: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 29922 1726853667.45084: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 29922 1726853667.45140: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 29922 1726853667.45195: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 29922 1726853667.45238: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 29922 1726853667.45318: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 29922 1726853667.45351: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 29922 1726853667.45393: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853667.45429: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 
(found_in_cache=True, class_only=False) 29922 1726853667.45482: variable '__network_wireless_connections_defined' from source: role '' defaults 29922 1726853667.45797: variable 'network_connections' from source: task vars 29922 1726853667.45808: variable 'interface' from source: set_fact 29922 1726853667.45890: variable 'interface' from source: set_fact 29922 1726853667.45906: variable 'interface' from source: set_fact 29922 1726853667.46046: variable 'interface' from source: set_fact 29922 1726853667.46441: variable '__network_packages_default_wireless' from source: role '' defaults 29922 1726853667.46697: variable '__network_wireless_connections_defined' from source: role '' defaults 29922 1726853667.47236: variable 'network_connections' from source: task vars 29922 1726853667.47255: variable 'interface' from source: set_fact 29922 1726853667.47330: variable 'interface' from source: set_fact 29922 1726853667.47355: variable 'interface' from source: set_fact 29922 1726853667.47427: variable 'interface' from source: set_fact 29922 1726853667.47567: variable '__network_packages_default_team' from source: role '' defaults 29922 1726853667.47574: variable '__network_team_connections_defined' from source: role '' defaults 29922 1726853667.47905: variable 'network_connections' from source: task vars 29922 1726853667.47916: variable 'interface' from source: set_fact 29922 1726853667.47995: variable 'interface' from source: set_fact 29922 1726853667.48006: variable 'interface' from source: set_fact 29922 1726853667.48076: variable 'interface' from source: set_fact 29922 1726853667.48169: variable '__network_service_name_default_initscripts' from source: role '' defaults 29922 1726853667.48240: variable '__network_service_name_default_initscripts' from source: role '' defaults 29922 1726853667.48251: variable '__network_packages_default_initscripts' from source: role '' defaults 29922 1726853667.48324: variable '__network_packages_default_initscripts' from source: role '' defaults 29922 1726853667.48543: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 29922 1726853667.48870: variable 'network_connections' from source: task vars 29922 1726853667.48875: variable 'interface' from source: set_fact 29922 1726853667.48916: variable 'interface' from source: set_fact 29922 1726853667.48922: variable 'interface' from source: set_fact 29922 1726853667.48968: variable 'interface' from source: set_fact 29922 1726853667.48997: variable 'ansible_distribution' from source: facts 29922 1726853667.49000: variable '__network_rh_distros' from source: role '' defaults 29922 1726853667.49005: variable 'ansible_distribution_major_version' from source: facts 29922 1726853667.49024: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 29922 1726853667.49139: variable 'ansible_distribution' from source: facts 29922 1726853667.49143: variable '__network_rh_distros' from source: role '' defaults 29922 1726853667.49153: variable 'ansible_distribution_major_version' from source: facts 29922 1726853667.49160: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 29922 1726853667.49278: variable 'ansible_distribution' from source: facts 29922 1726853667.49282: variable '__network_rh_distros' from source: role '' defaults 29922 1726853667.49294: variable 'ansible_distribution_major_version' from source: facts 29922 1726853667.49318: variable 'network_provider' from source: set_fact 29922 1726853667.49335: 
variable 'omit' from source: magic vars 29922 1726853667.49360: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29922 1726853667.49383: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29922 1726853667.49401: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29922 1726853667.49413: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853667.49422: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853667.49446: variable 'inventory_hostname' from source: host vars for 'managed_node3' 29922 1726853667.49449: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853667.49451: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853667.49576: Set connection var ansible_connection to ssh 29922 1726853667.49581: Set connection var ansible_module_compression to ZIP_DEFLATED 29922 1726853667.49583: Set connection var ansible_shell_executable to /bin/sh 29922 1726853667.49599: Set connection var ansible_pipelining to False 29922 1726853667.49609: Set connection var ansible_timeout to 10 29922 1726853667.49748: Set connection var ansible_shell_type to sh 29922 1726853667.49751: variable 'ansible_shell_executable' from source: unknown 29922 1726853667.49754: variable 'ansible_connection' from source: unknown 29922 1726853667.49759: variable 'ansible_module_compression' from source: unknown 29922 1726853667.49761: variable 'ansible_shell_type' from source: unknown 29922 1726853667.49762: variable 'ansible_shell_executable' from source: unknown 29922 1726853667.49764: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853667.49770: variable 'ansible_pipelining' from source: unknown 29922 1726853667.49773: variable 'ansible_timeout' from source: unknown 29922 1726853667.49775: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853667.50077: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 29922 1726853667.50080: variable 'omit' from source: magic vars 29922 1726853667.50082: starting attempt loop 29922 1726853667.50087: running the handler 29922 1726853667.50160: variable 'ansible_facts' from source: unknown 29922 1726853667.50929: _low_level_execute_command(): starting 29922 1726853667.50936: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 29922 1726853667.51790: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853667.51838: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853667.51850: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853667.51912: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853667.52078: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853667.53752: stdout chunk (state=3): >>>/root <<< 29922 1726853667.53853: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853667.53882: stderr chunk (state=3): >>><<< 29922 1726853667.53888: stdout chunk (state=3): >>><<< 29922 1726853667.53908: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853667.53920: _low_level_execute_command(): starting 29922 1726853667.53931: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853667.5391276-30717-96015449461207 `" && echo ansible-tmp-1726853667.5391276-30717-96015449461207="` echo /root/.ansible/tmp/ansible-tmp-1726853667.5391276-30717-96015449461207 `" ) && sleep 0' 29922 1726853667.54356: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853667.54361: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853667.54391: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853667.54394: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853667.54448: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853667.54452: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853667.54457: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853667.54520: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853667.56512: stdout chunk (state=3): >>>ansible-tmp-1726853667.5391276-30717-96015449461207=/root/.ansible/tmp/ansible-tmp-1726853667.5391276-30717-96015449461207 <<< 29922 1726853667.56639: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853667.56652: stderr chunk (state=3): >>><<< 29922 1726853667.56660: stdout chunk (state=3): >>><<< 29922 1726853667.56685: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853667.5391276-30717-96015449461207=/root/.ansible/tmp/ansible-tmp-1726853667.5391276-30717-96015449461207 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853667.56782: variable 'ansible_module_compression' from source: unknown 29922 1726853667.56836: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 29922 1726853667.56851: ANSIBALLZ: Acquiring lock 29922 1726853667.56861: ANSIBALLZ: Lock acquired: 140376041361328 29922 1726853667.56870: ANSIBALLZ: Creating module 29922 1726853668.06709: ANSIBALLZ: Writing module into payload 29922 1726853668.06858: ANSIBALLZ: Writing module 29922 1726853668.07089: ANSIBALLZ: Renaming module 29922 1726853668.07144: ANSIBALLZ: Done creating module 29922 1726853668.07147: variable 'ansible_facts' from source: unknown 29922 1726853668.07536: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853667.5391276-30717-96015449461207/AnsiballZ_systemd.py 29922 1726853668.07881: Sending initial data 29922 1726853668.07885: Sent initial data (155 bytes) 29922 1726853668.09278: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853668.09393: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853668.09577: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853668.11193: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 29922 1726853668.11254: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 29922 1726853668.11312: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-29922avbm1zsd/tmpu5i2y0xs /root/.ansible/tmp/ansible-tmp-1726853667.5391276-30717-96015449461207/AnsiballZ_systemd.py <<< 29922 1726853668.11315: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853667.5391276-30717-96015449461207/AnsiballZ_systemd.py" <<< 29922 1726853668.11380: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-29922avbm1zsd/tmpu5i2y0xs" to remote "/root/.ansible/tmp/ansible-tmp-1726853667.5391276-30717-96015449461207/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853667.5391276-30717-96015449461207/AnsiballZ_systemd.py" <<< 29922 1726853668.13595: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853668.13632: stderr chunk (state=3): >>><<< 29922 1726853668.13635: stdout chunk (state=3): >>><<< 29922 1726853668.13674: done transferring module to remote 29922 1726853668.13685: _low_level_execute_command(): starting 29922 1726853668.13690: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853667.5391276-30717-96015449461207/ /root/.ansible/tmp/ansible-tmp-1726853667.5391276-30717-96015449461207/AnsiballZ_systemd.py && sleep 0' 29922 1726853668.14624: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853668.14635: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853668.14745: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853668.14749: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853668.14751: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 29922 1726853668.14753: stderr chunk (state=3): >>>debug2: match not found <<< 29922 1726853668.14755: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853668.14757: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 29922 1726853668.14759: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<< 29922 1726853668.14761: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 29922 1726853668.14763: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853668.14765: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853668.14767: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853668.14769: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 29922 1726853668.14796: stderr chunk (state=3): >>>debug2: match found <<< 29922 1726853668.14799: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853668.14906: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853668.14914: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853668.14917: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 
1726853668.14999: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853668.17281: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853668.17286: stdout chunk (state=3): >>><<< 29922 1726853668.17294: stderr chunk (state=3): >>><<< 29922 1726853668.17354: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853668.17357: _low_level_execute_command(): starting 29922 1726853668.17377: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853667.5391276-30717-96015449461207/AnsiballZ_systemd.py && sleep 0' 29922 1726853668.18677: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853668.18686: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853668.18701: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 29922 1726853668.18712: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853668.18767: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration <<< 29922 1726853668.18770: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 29922 1726853668.18775: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853668.18819: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853668.18825: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853668.18849: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853668.18985: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session 
id: 2 <<< 29922 1726853668.48367: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "705", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ExecMainStartTimestampMonotonic": "24298536", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ExecMainHandoffTimestampMonotonic": "24318182", "ExecMainPID": "705", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10694656", "MemoryPeak": "14114816", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3324571648", "EffectiveMemoryMax": "3702865920", "EffectiveMemoryHigh": "3702865920", "CPUUsageNSec": "2063477000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", 
"MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpR<<< 29922 1726853668.48379: stdout chunk (state=3): >>>eceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice sysinit.target dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.target shutdown.target cloud-init.service NetworkManager-wait-online.service multi-user.target", "After": "sysini<<< 29922 1726853668.48400: stdout chunk (state=3): >>>t.target systemd-journald.socket basic.target cloud-init-local.service network-pre.target dbus.socket system.slice dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:26:57 EDT", "StateChangeTimestampMonotonic": "361843458", "InactiveExitTimestamp": "Fri 2024-09-20 13:21:20 EDT", "InactiveExitTimestampMonotonic": "24299070", "ActiveEnterTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ActiveEnterTimestampMonotonic": "24855925", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ConditionTimestampMonotonic": "24297535", "AssertTimestamp": "Fri 2024-09-20 13:21:20 EDT", "AssertTimestampMonotonic": "24297537", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "125a1bdc44cb4bffa8aeca788d2f2fa3", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 29922 1726853668.50293: stderr chunk (state=3): >>>debug2: Received 
exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 29922 1726853668.50323: stderr chunk (state=3): >>><<< 29922 1726853668.50327: stdout chunk (state=3): >>><<< 29922 1726853668.50344: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "705", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ExecMainStartTimestampMonotonic": "24298536", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ExecMainHandoffTimestampMonotonic": "24318182", "ExecMainPID": "705", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10694656", "MemoryPeak": "14114816", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3324571648", "EffectiveMemoryMax": "3702865920", "EffectiveMemoryHigh": "3702865920", "CPUUsageNSec": "2063477000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": 
"[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", 
"RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice sysinit.target dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.target shutdown.target cloud-init.service NetworkManager-wait-online.service multi-user.target", "After": "sysinit.target systemd-journald.socket basic.target cloud-init-local.service network-pre.target dbus.socket system.slice dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:26:57 EDT", "StateChangeTimestampMonotonic": "361843458", "InactiveExitTimestamp": "Fri 2024-09-20 13:21:20 EDT", "InactiveExitTimestampMonotonic": "24299070", "ActiveEnterTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ActiveEnterTimestampMonotonic": "24855925", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ConditionTimestampMonotonic": "24297535", "AssertTimestamp": "Fri 2024-09-20 13:21:20 EDT", "AssertTimestampMonotonic": "24297537", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "125a1bdc44cb4bffa8aeca788d2f2fa3", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 29922 1726853668.50467: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853667.5391276-30717-96015449461207/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 29922 1726853668.50474: _low_level_execute_command(): starting 29922 1726853668.50481: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853667.5391276-30717-96015449461207/ > /dev/null 2>&1 && sleep 0' 29922 1726853668.50939: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853668.50942: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 29922 1726853668.50945: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 29922 1726853668.50948: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853668.50950: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853668.51004: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853668.51007: stderr chunk (state=3): 
>>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853668.51009: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853668.51075: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853668.52919: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853668.52945: stderr chunk (state=3): >>><<< 29922 1726853668.52948: stdout chunk (state=3): >>><<< 29922 1726853668.52961: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853668.52968: handler run complete 29922 1726853668.53012: attempt loop complete, returning result 29922 1726853668.53015: _execute() done 29922 1726853668.53017: dumping result to json 29922 1726853668.53029: done dumping result, returning 29922 1726853668.53039: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [02083763-bbaf-51d4-513b-000000000023] 29922 1726853668.53041: sending task result for task 02083763-bbaf-51d4-513b-000000000023 29922 1726853668.53275: done sending task result for task 02083763-bbaf-51d4-513b-000000000023 29922 1726853668.53278: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 29922 1726853668.53323: no more pending results, returning what we have 29922 1726853668.53326: results queue empty 29922 1726853668.53327: checking for any_errors_fatal 29922 1726853668.53333: done checking for any_errors_fatal 29922 1726853668.53334: checking for max_fail_percentage 29922 1726853668.53335: done checking for max_fail_percentage 29922 1726853668.53336: checking to see if all hosts have failed and the running result is not ok 29922 1726853668.53337: done checking to see if all hosts have failed 29922 1726853668.53337: getting the remaining hosts for this loop 29922 1726853668.53339: done getting the remaining hosts for this loop 29922 1726853668.53342: getting the next task for host managed_node3 29922 1726853668.53348: done getting next task for host managed_node3 29922 1726853668.53351: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 29922 1726853668.53353: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, 
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29922 1726853668.53366: getting variables 29922 1726853668.53368: in VariableManager get_vars() 29922 1726853668.53401: Calling all_inventory to load vars for managed_node3 29922 1726853668.53403: Calling groups_inventory to load vars for managed_node3 29922 1726853668.53405: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853668.53414: Calling all_plugins_play to load vars for managed_node3 29922 1726853668.53417: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853668.53419: Calling groups_plugins_play to load vars for managed_node3 29922 1726853668.54339: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853668.55203: done with get_vars() 29922 1726853668.55223: done getting variables 29922 1726853668.55266: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 13:34:28 -0400 (0:00:01.201) 0:00:17.482 ****** 29922 1726853668.55291: entering _queue_task() for managed_node3/service 29922 1726853668.55534: worker is 1 (out of 1 available) 29922 1726853668.55549: exiting _queue_task() for managed_node3/service 29922 1726853668.55561: done queuing things up, now waiting for results queue to drain 29922 1726853668.55562: waiting for pending results... 
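
The exchange above is the standard module-execution lifecycle for the "Enable and start NetworkManager" task: create a temporary directory under /root/.ansible/tmp, upload AnsiballZ_systemd.py over the multiplexed SSH connection via sftp, chmod it, run it with /usr/bin/python3.12, and remove the temporary directory once the JSON result has been read back. Based on the module_args recorded in the result (name=NetworkManager, state=started, enabled=true, no_log in effect), an equivalent standalone task would look roughly like the sketch below; this is an illustration reconstructed from the log, not the role's actual task file, and ansible.builtin.systemd is assumed here in place of the role's 'service' action, which resolved to ansible.legacy.systemd on this host.

    - name: Enable and start NetworkManager     # sketch reconstructed from the logged module_args
      ansible.builtin.systemd:                  # assumed direct module call; the role goes through the 'service' action plugin
        name: NetworkManager
        state: started
        enabled: true
      no_log: true                              # matches the censored result shown above

Because the unit was already ActiveState "active" and UnitFileState "enabled", the module reported "changed": false, so the task comes back as ok rather than changed.
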
29922 1726853668.55741: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 29922 1726853668.55835: in run() - task 02083763-bbaf-51d4-513b-000000000024 29922 1726853668.55848: variable 'ansible_search_path' from source: unknown 29922 1726853668.55851: variable 'ansible_search_path' from source: unknown 29922 1726853668.55884: calling self._execute() 29922 1726853668.55961: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853668.55967: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853668.55977: variable 'omit' from source: magic vars 29922 1726853668.56264: variable 'ansible_distribution_major_version' from source: facts 29922 1726853668.56274: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853668.56353: variable 'network_provider' from source: set_fact 29922 1726853668.56361: Evaluated conditional (network_provider == "nm"): True 29922 1726853668.56423: variable '__network_wpa_supplicant_required' from source: role '' defaults 29922 1726853668.56489: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 29922 1726853668.56607: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 29922 1726853668.58042: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 29922 1726853668.58091: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 29922 1726853668.58118: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 29922 1726853668.58142: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 29922 1726853668.58164: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 29922 1726853668.58236: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853668.58256: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29922 1726853668.58277: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853668.58309: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853668.58319: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853668.58352: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853668.58372: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 29922 1726853668.58390: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853668.58419: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853668.58429: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853668.58457: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853668.58475: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29922 1726853668.58492: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853668.58520: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853668.58531: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853668.58624: variable 'network_connections' from source: task vars 29922 1726853668.58633: variable 'interface' from source: set_fact 29922 1726853668.58687: variable 'interface' from source: set_fact 29922 1726853668.58695: variable 'interface' from source: set_fact 29922 1726853668.58739: variable 'interface' from source: set_fact 29922 1726853668.58806: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 29922 1726853668.58918: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 29922 1726853668.58951: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 29922 1726853668.58970: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 29922 1726853668.58993: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 29922 1726853668.59023: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 29922 1726853668.59039: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 29922 1726853668.59062: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853668.59082: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 29922 1726853668.59120: variable '__network_wireless_connections_defined' from source: role '' defaults 29922 1726853668.59282: variable 'network_connections' from source: task vars 29922 1726853668.59286: variable 'interface' from source: set_fact 29922 1726853668.59328: variable 'interface' from source: set_fact 29922 1726853668.59332: variable 'interface' from source: set_fact 29922 1726853668.59380: variable 'interface' from source: set_fact 29922 1726853668.59421: Evaluated conditional (__network_wpa_supplicant_required): False 29922 1726853668.59424: when evaluation is False, skipping this task 29922 1726853668.59427: _execute() done 29922 1726853668.59440: dumping result to json 29922 1726853668.59443: done dumping result, returning 29922 1726853668.59445: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [02083763-bbaf-51d4-513b-000000000024] 29922 1726853668.59447: sending task result for task 02083763-bbaf-51d4-513b-000000000024 29922 1726853668.59535: done sending task result for task 02083763-bbaf-51d4-513b-000000000024 29922 1726853668.59538: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 29922 1726853668.59587: no more pending results, returning what we have 29922 1726853668.59591: results queue empty 29922 1726853668.59592: checking for any_errors_fatal 29922 1726853668.59615: done checking for any_errors_fatal 29922 1726853668.59616: checking for max_fail_percentage 29922 1726853668.59617: done checking for max_fail_percentage 29922 1726853668.59618: checking to see if all hosts have failed and the running result is not ok 29922 1726853668.59619: done checking to see if all hosts have failed 29922 1726853668.59619: getting the remaining hosts for this loop 29922 1726853668.59621: done getting the remaining hosts for this loop 29922 1726853668.59624: getting the next task for host managed_node3 29922 1726853668.59630: done getting next task for host managed_node3 29922 1726853668.59633: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 29922 1726853668.59635: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853668.59652: getting variables 29922 1726853668.59653: in VariableManager get_vars() 29922 1726853668.59691: Calling all_inventory to load vars for managed_node3 29922 1726853668.59693: Calling groups_inventory to load vars for managed_node3 29922 1726853668.59695: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853668.59705: Calling all_plugins_play to load vars for managed_node3 29922 1726853668.59708: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853668.59710: Calling groups_plugins_play to load vars for managed_node3 29922 1726853668.60523: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853668.61490: done with get_vars() 29922 1726853668.61506: done getting variables 29922 1726853668.61550: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 13:34:28 -0400 (0:00:00.062) 0:00:17.545 ****** 29922 1726853668.61576: entering _queue_task() for managed_node3/service 29922 1726853668.61815: worker is 1 (out of 1 available) 29922 1726853668.61829: exiting _queue_task() for managed_node3/service 29922 1726853668.61842: done queuing things up, now waiting for results queue to drain 29922 1726853668.61844: waiting for pending results... 29922 1726853668.62022: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 29922 1726853668.62117: in run() - task 02083763-bbaf-51d4-513b-000000000025 29922 1726853668.62129: variable 'ansible_search_path' from source: unknown 29922 1726853668.62133: variable 'ansible_search_path' from source: unknown 29922 1726853668.62160: calling self._execute() 29922 1726853668.62240: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853668.62243: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853668.62253: variable 'omit' from source: magic vars 29922 1726853668.62532: variable 'ansible_distribution_major_version' from source: facts 29922 1726853668.62541: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853668.62624: variable 'network_provider' from source: set_fact 29922 1726853668.62628: Evaluated conditional (network_provider == "initscripts"): False 29922 1726853668.62630: when evaluation is False, skipping this task 29922 1726853668.62634: _execute() done 29922 1726853668.62636: dumping result to json 29922 1726853668.62639: done dumping result, returning 29922 1726853668.62645: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [02083763-bbaf-51d4-513b-000000000025] 29922 1726853668.62649: sending task result for task 02083763-bbaf-51d4-513b-000000000025 29922 1726853668.62733: done sending task result for task 02083763-bbaf-51d4-513b-000000000025 29922 1726853668.62736: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 29922 
1726853668.62780: no more pending results, returning what we have 29922 1726853668.62785: results queue empty 29922 1726853668.62785: checking for any_errors_fatal 29922 1726853668.62795: done checking for any_errors_fatal 29922 1726853668.62795: checking for max_fail_percentage 29922 1726853668.62797: done checking for max_fail_percentage 29922 1726853668.62798: checking to see if all hosts have failed and the running result is not ok 29922 1726853668.62798: done checking to see if all hosts have failed 29922 1726853668.62799: getting the remaining hosts for this loop 29922 1726853668.62800: done getting the remaining hosts for this loop 29922 1726853668.62804: getting the next task for host managed_node3 29922 1726853668.62810: done getting next task for host managed_node3 29922 1726853668.62813: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 29922 1726853668.62817: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29922 1726853668.62832: getting variables 29922 1726853668.62833: in VariableManager get_vars() 29922 1726853668.62868: Calling all_inventory to load vars for managed_node3 29922 1726853668.62880: Calling groups_inventory to load vars for managed_node3 29922 1726853668.62882: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853668.62891: Calling all_plugins_play to load vars for managed_node3 29922 1726853668.62893: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853668.62895: Calling groups_plugins_play to load vars for managed_node3 29922 1726853668.63669: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853668.64537: done with get_vars() 29922 1726853668.64553: done getting variables 29922 1726853668.64596: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 13:34:28 -0400 (0:00:00.030) 0:00:17.575 ****** 29922 1726853668.64621: entering _queue_task() for managed_node3/copy 29922 1726853668.64846: worker is 1 (out of 1 available) 29922 1726853668.64859: exiting _queue_task() for managed_node3/copy 29922 1726853668.64873: done queuing things up, now waiting for results queue to drain 29922 1726853668.64874: waiting for pending results... 
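
The two skips above (wpa_supplicant and the initscripts network service) come from per-task when: conditions: with network_provider evaluated as "nm" and __network_wpa_supplicant_required evaluated as False, the initscripts- and supplicant-specific tasks short-circuit before any command is sent to the managed node. A minimal sketch of the pattern, reusing the condition name from the log (the module and service name are assumptions for illustration, not taken from the role's task file):

    - name: Enable and start wpa_supplicant
      ansible.builtin.service:                      # assumed module; the role loads the 'service' action plugin
        name: wpa_supplicant                        # assumed service name, inferred from the task title
        state: started
        enabled: true
      when: __network_wpa_supplicant_required       # logged false_condition; False here, so the task is skipped

A skipped task yields a result like the skipping: [managed_node3] entry above, with false_condition recording which expression evaluated to False.
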
29922 1726853668.65055: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 29922 1726853668.65156: in run() - task 02083763-bbaf-51d4-513b-000000000026 29922 1726853668.65170: variable 'ansible_search_path' from source: unknown 29922 1726853668.65176: variable 'ansible_search_path' from source: unknown 29922 1726853668.65206: calling self._execute() 29922 1726853668.65279: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853668.65285: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853668.65293: variable 'omit' from source: magic vars 29922 1726853668.65576: variable 'ansible_distribution_major_version' from source: facts 29922 1726853668.65585: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853668.65666: variable 'network_provider' from source: set_fact 29922 1726853668.65669: Evaluated conditional (network_provider == "initscripts"): False 29922 1726853668.65674: when evaluation is False, skipping this task 29922 1726853668.65676: _execute() done 29922 1726853668.65679: dumping result to json 29922 1726853668.65682: done dumping result, returning 29922 1726853668.65689: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [02083763-bbaf-51d4-513b-000000000026] 29922 1726853668.65692: sending task result for task 02083763-bbaf-51d4-513b-000000000026 29922 1726853668.65777: done sending task result for task 02083763-bbaf-51d4-513b-000000000026 29922 1726853668.65780: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 29922 1726853668.65825: no more pending results, returning what we have 29922 1726853668.65829: results queue empty 29922 1726853668.65830: checking for any_errors_fatal 29922 1726853668.65838: done checking for any_errors_fatal 29922 1726853668.65839: checking for max_fail_percentage 29922 1726853668.65840: done checking for max_fail_percentage 29922 1726853668.65841: checking to see if all hosts have failed and the running result is not ok 29922 1726853668.65842: done checking to see if all hosts have failed 29922 1726853668.65842: getting the remaining hosts for this loop 29922 1726853668.65844: done getting the remaining hosts for this loop 29922 1726853668.65847: getting the next task for host managed_node3 29922 1726853668.65853: done getting next task for host managed_node3 29922 1726853668.65856: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 29922 1726853668.65859: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853668.65876: getting variables 29922 1726853668.65877: in VariableManager get_vars() 29922 1726853668.65908: Calling all_inventory to load vars for managed_node3 29922 1726853668.65910: Calling groups_inventory to load vars for managed_node3 29922 1726853668.65912: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853668.65920: Calling all_plugins_play to load vars for managed_node3 29922 1726853668.65922: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853668.65925: Calling groups_plugins_play to load vars for managed_node3 29922 1726853668.66809: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853668.67668: done with get_vars() 29922 1726853668.67685: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 13:34:28 -0400 (0:00:00.031) 0:00:17.607 ****** 29922 1726853668.67745: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 29922 1726853668.67746: Creating lock for fedora.linux_system_roles.network_connections 29922 1726853668.67978: worker is 1 (out of 1 available) 29922 1726853668.67993: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 29922 1726853668.68005: done queuing things up, now waiting for results queue to drain 29922 1726853668.68006: waiting for pending results... 29922 1726853668.68187: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 29922 1726853668.68277: in run() - task 02083763-bbaf-51d4-513b-000000000027 29922 1726853668.68289: variable 'ansible_search_path' from source: unknown 29922 1726853668.68293: variable 'ansible_search_path' from source: unknown 29922 1726853668.68320: calling self._execute() 29922 1726853668.68393: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853668.68397: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853668.68407: variable 'omit' from source: magic vars 29922 1726853668.68689: variable 'ansible_distribution_major_version' from source: facts 29922 1726853668.68698: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853668.68701: variable 'omit' from source: magic vars 29922 1726853668.68738: variable 'omit' from source: magic vars 29922 1726853668.68849: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 29922 1726853668.70285: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 29922 1726853668.70332: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 29922 1726853668.70360: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 29922 1726853668.70390: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 29922 1726853668.70412: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 29922 1726853668.70476: variable 'network_provider' from source: set_fact 29922 1726853668.70574: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853668.70605: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29922 1726853668.70624: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853668.70655: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853668.70669: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853668.70724: variable 'omit' from source: magic vars 29922 1726853668.70804: variable 'omit' from source: magic vars 29922 1726853668.70876: variable 'network_connections' from source: task vars 29922 1726853668.70887: variable 'interface' from source: set_fact 29922 1726853668.70932: variable 'interface' from source: set_fact 29922 1726853668.70938: variable 'interface' from source: set_fact 29922 1726853668.70986: variable 'interface' from source: set_fact 29922 1726853668.71208: variable 'omit' from source: magic vars 29922 1726853668.71215: variable '__lsr_ansible_managed' from source: task vars 29922 1726853668.71255: variable '__lsr_ansible_managed' from source: task vars 29922 1726853668.71442: Loaded config def from plugin (lookup/template) 29922 1726853668.71446: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 29922 1726853668.71469: File lookup term: get_ansible_managed.j2 29922 1726853668.71474: variable 'ansible_search_path' from source: unknown 29922 1726853668.71477: evaluation_path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 29922 1726853668.71490: search_path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 29922 1726853668.71505: variable 'ansible_search_path' from source: unknown 29922 1726853668.74922: variable 'ansible_managed' from source: unknown 29922 
1726853668.75010: variable 'omit' from source: magic vars 29922 1726853668.75032: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29922 1726853668.75057: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29922 1726853668.75078: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29922 1726853668.75090: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853668.75099: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853668.75121: variable 'inventory_hostname' from source: host vars for 'managed_node3' 29922 1726853668.75124: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853668.75127: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853668.75195: Set connection var ansible_connection to ssh 29922 1726853668.75201: Set connection var ansible_module_compression to ZIP_DEFLATED 29922 1726853668.75208: Set connection var ansible_shell_executable to /bin/sh 29922 1726853668.75214: Set connection var ansible_pipelining to False 29922 1726853668.75219: Set connection var ansible_timeout to 10 29922 1726853668.75222: Set connection var ansible_shell_type to sh 29922 1726853668.75240: variable 'ansible_shell_executable' from source: unknown 29922 1726853668.75243: variable 'ansible_connection' from source: unknown 29922 1726853668.75246: variable 'ansible_module_compression' from source: unknown 29922 1726853668.75248: variable 'ansible_shell_type' from source: unknown 29922 1726853668.75250: variable 'ansible_shell_executable' from source: unknown 29922 1726853668.75254: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853668.75256: variable 'ansible_pipelining' from source: unknown 29922 1726853668.75274: variable 'ansible_timeout' from source: unknown 29922 1726853668.75277: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853668.75356: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 29922 1726853668.75369: variable 'omit' from source: magic vars 29922 1726853668.75381: starting attempt loop 29922 1726853668.75384: running the handler 29922 1726853668.75396: _low_level_execute_command(): starting 29922 1726853668.75404: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 29922 1726853668.75906: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853668.75909: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853668.75912: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config <<< 29922 1726853668.75916: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 29922 1726853668.75919: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853668.75963: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853668.75967: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853668.75987: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853668.76049: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853668.77726: stdout chunk (state=3): >>>/root <<< 29922 1726853668.77829: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853668.77862: stderr chunk (state=3): >>><<< 29922 1726853668.77865: stdout chunk (state=3): >>><<< 29922 1726853668.77886: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853668.77897: _low_level_execute_command(): starting 29922 1726853668.77904: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853668.7788675-30776-22255869183236 `" && echo ansible-tmp-1726853668.7788675-30776-22255869183236="` echo /root/.ansible/tmp/ansible-tmp-1726853668.7788675-30776-22255869183236 `" ) && sleep 0' 29922 1726853668.78344: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853668.78347: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853668.78350: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853668.78352: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853668.78357: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 29922 1726853668.78359: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853668.78412: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853668.78416: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853668.78418: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853668.78470: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853668.80404: stdout chunk (state=3): >>>ansible-tmp-1726853668.7788675-30776-22255869183236=/root/.ansible/tmp/ansible-tmp-1726853668.7788675-30776-22255869183236 <<< 29922 1726853668.80520: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853668.80540: stderr chunk (state=3): >>><<< 29922 1726853668.80543: stdout chunk (state=3): >>><<< 29922 1726853668.80559: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853668.7788675-30776-22255869183236=/root/.ansible/tmp/ansible-tmp-1726853668.7788675-30776-22255869183236 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853668.80597: variable 'ansible_module_compression' from source: unknown 29922 1726853668.80641: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 29922 1726853668.80644: ANSIBALLZ: Acquiring lock 29922 1726853668.80647: ANSIBALLZ: Lock acquired: 140376041478176 29922 1726853668.80649: ANSIBALLZ: Creating module 29922 1726853668.93254: ANSIBALLZ: Writing module into payload 29922 1726853668.93485: ANSIBALLZ: Writing module 29922 1726853668.93507: ANSIBALLZ: Renaming module 29922 1726853668.93512: ANSIBALLZ: Done creating module 29922 1726853668.93533: variable 'ansible_facts' from source: unknown 29922 1726853668.93606: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853668.7788675-30776-22255869183236/AnsiballZ_network_connections.py 29922 1726853668.93713: 
Sending initial data 29922 1726853668.93716: Sent initial data (167 bytes) 29922 1726853668.94161: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853668.94197: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853668.94200: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853668.94202: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853668.94204: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853668.94293: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853668.94297: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853668.94338: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853668.94405: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853668.96079: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 29922 1726853668.96083: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 29922 1726853668.96137: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 29922 1726853668.96221: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-29922avbm1zsd/tmpk0z7duk6 /root/.ansible/tmp/ansible-tmp-1726853668.7788675-30776-22255869183236/AnsiballZ_network_connections.py <<< 29922 1726853668.96225: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853668.7788675-30776-22255869183236/AnsiballZ_network_connections.py" <<< 29922 1726853668.96296: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-29922avbm1zsd/tmpk0z7duk6" to remote "/root/.ansible/tmp/ansible-tmp-1726853668.7788675-30776-22255869183236/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853668.7788675-30776-22255869183236/AnsiballZ_network_connections.py" <<< 29922 1726853668.97460: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853668.97476: stdout chunk (state=3): >>><<< 29922 1726853668.97482: stderr chunk (state=3): >>><<< 29922 1726853668.97521: done transferring module to remote 29922 1726853668.97527: _low_level_execute_command(): starting 29922 1726853668.97531: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853668.7788675-30776-22255869183236/ /root/.ansible/tmp/ansible-tmp-1726853668.7788675-30776-22255869183236/AnsiballZ_network_connections.py && sleep 0' 29922 1726853668.97945: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853668.97978: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853668.97981: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 29922 1726853668.97983: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration <<< 29922 1726853668.97985: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853668.97987: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853668.98030: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853668.98040: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853668.98114: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853669.00028: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853669.00031: stdout chunk (state=3): >>><<< 29922 1726853669.00034: stderr chunk (state=3): >>><<< 29922 1726853669.00049: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853669.00122: _low_level_execute_command(): starting 29922 1726853669.00125: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853668.7788675-30776-22255869183236/AnsiballZ_network_connections.py && sleep 0' 29922 1726853669.00651: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853669.00690: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 29922 1726853669.00740: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853669.00807: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853669.00825: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853669.00858: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853669.00967: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853669.45323: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, 0bb5d45f-2ba9-4c52-9232-d0cdb613594b\n[004] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 0bb5d45f-2ba9-4c52-9232-d0cdb613594b (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"dhcp4": false, "address": ["198.51.100.3/26", "2001:db8::2/32"], "route": [{"network": "198.51.100.64", "prefix": 26, "gateway": "198.51.100.6", "metric": 4, "table": 30200}, {"network": "198.51.100.128", "prefix": 26, 
"gateway": "198.51.100.1", "metric": 2, "table": 30400}, {"network": "2001:db8::4", "prefix": 32, "gateway": "2001:db8::1", "metric": 2, "table": 30600}], "routing_rule": [{"priority": 30200, "from": "198.51.100.58/26", "table": 30200}, {"priority": 30201, "family": "ipv4", "fwmark": 1, "fwmask": 1, "table": 30200}, {"priority": 30202, "family": "ipv4", "ipproto": 6, "table": 30200}, {"priority": 30203, "family": "ipv4", "sport": "128 - 256", "table": 30200}, {"priority": 30204, "family": "ipv4", "tos": 8, "table": 30200}, {"priority": 30400, "to": "198.51.100.128/26", "table": 30400}, {"priority": 30401, "family": "ipv4", "iif": "iiftest", "table": 30400}, {"priority": 30402, "family": "ipv4", "oif": "oiftest", "table": 30400}, {"priority": 30403, "from": "0.0.0.0/0", "to": "0.0.0.0/0", "table": 30400}, {"priority": 30600, "to": "2001:db8::4/32", "table": 30600}, {"priority": 30601, "family": "ipv6", "dport": "128 - 256", "invert": true, "table": 30600}, {"priority": 30602, "from": "::/0", "to": "::/0", "table": 30600}, {"priority": 200, "from": "198.51.100.56/26", "table": "custom"}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"dhcp4": false, "address": ["198.51.100.3/26", "2001:db8::2/32"], "route": [{"network": "198.51.100.64", "prefix": 26, "gateway": "198.51.100.6", "metric": 4, "table": 30200}, {"network": "198.51.100.128", "prefix": 26, "gateway": "198.51.100.1", "metric": 2, "table": 30400}, {"network": "2001:db8::4", "prefix": 32, "gateway": "2001:db8::1", "metric": 2, "table": 30600}], "routing_rule": [{"priority": 30200, "from": "198.51.100.58/26", "table": 30200}, {"priority": 30201, "family": "ipv4", "fwmark": 1, "fwmask": 1, "table": 30200}, {"priority": 30202, "family": "ipv4", "ipproto": 6, "table": 30200}, {"priority": 30203, "family": "ipv4", "sport": "128 - 256", "table": 30200}, {"priority": 30204, "family": "ipv4", "tos": 8, "table": 30200}, {"priority": 30400, "to": "198.51.100.128/26", "table": 30400}, {"priority": 30401, "family": "ipv4", "iif": "iiftest", "table": 30400}, {"priority": 30402, "family": "ipv4", "oif": "oiftest", "table": 30400}, {"priority": 30403, "from": "0.0.0.0/0", "to": "0.0.0.0/0", "table": 30400}, {"priority": 30600, "to": "2001:db8::4/32", "table": 30600}, {"priority": 30601, "family": "ipv6", "dport": "128 - 256", "invert": true, "table": 30600}, {"priority": 30602, "from": "::/0", "to": "::/0", "table": 30600}, {"priority": 200, "from": "198.51.100.56/26", "table": "custom"}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 29922 1726853669.47537: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 29922 1726853669.47541: stdout chunk (state=3): >>><<< 29922 1726853669.47544: stderr chunk (state=3): >>><<< 29922 1726853669.47574: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, 0bb5d45f-2ba9-4c52-9232-d0cdb613594b\n[004] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 0bb5d45f-2ba9-4c52-9232-d0cdb613594b (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"dhcp4": false, "address": ["198.51.100.3/26", "2001:db8::2/32"], "route": [{"network": "198.51.100.64", "prefix": 26, "gateway": "198.51.100.6", "metric": 4, "table": 30200}, {"network": "198.51.100.128", "prefix": 26, "gateway": "198.51.100.1", "metric": 2, "table": 30400}, {"network": "2001:db8::4", "prefix": 32, "gateway": "2001:db8::1", "metric": 2, "table": 30600}], "routing_rule": [{"priority": 30200, "from": "198.51.100.58/26", "table": 30200}, {"priority": 30201, "family": "ipv4", "fwmark": 1, "fwmask": 1, "table": 30200}, {"priority": 30202, "family": "ipv4", "ipproto": 6, "table": 30200}, {"priority": 30203, "family": "ipv4", "sport": "128 - 256", "table": 30200}, {"priority": 30204, "family": "ipv4", "tos": 8, "table": 30200}, {"priority": 30400, "to": "198.51.100.128/26", "table": 30400}, {"priority": 30401, "family": "ipv4", "iif": "iiftest", "table": 30400}, {"priority": 30402, "family": "ipv4", "oif": "oiftest", "table": 30400}, {"priority": 30403, "from": "0.0.0.0/0", "to": "0.0.0.0/0", "table": 30400}, {"priority": 30600, "to": "2001:db8::4/32", "table": 30600}, {"priority": 30601, "family": "ipv6", "dport": "128 - 256", "invert": true, "table": 30600}, {"priority": 30602, "from": "::/0", "to": "::/0", "table": 30600}, {"priority": 200, "from": "198.51.100.56/26", "table": "custom"}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"dhcp4": false, "address": ["198.51.100.3/26", "2001:db8::2/32"], "route": [{"network": "198.51.100.64", "prefix": 26, "gateway": "198.51.100.6", "metric": 4, "table": 30200}, {"network": "198.51.100.128", "prefix": 26, "gateway": "198.51.100.1", "metric": 2, "table": 30400}, {"network": "2001:db8::4", "prefix": 32, "gateway": "2001:db8::1", "metric": 2, "table": 30600}], "routing_rule": [{"priority": 30200, "from": "198.51.100.58/26", "table": 30200}, {"priority": 30201, "family": "ipv4", "fwmark": 1, "fwmask": 1, "table": 30200}, {"priority": 30202, "family": "ipv4", "ipproto": 6, "table": 30200}, {"priority": 30203, "family": "ipv4", "sport": "128 - 256", "table": 30200}, {"priority": 30204, "family": "ipv4", "tos": 8, "table": 30200}, {"priority": 30400, "to": "198.51.100.128/26", "table": 30400}, {"priority": 30401, "family": "ipv4", "iif": "iiftest", "table": 30400}, {"priority": 30402, "family": "ipv4", "oif": "oiftest", "table": 30400}, {"priority": 30403, "from": "0.0.0.0/0", "to": "0.0.0.0/0", "table": 30400}, {"priority": 30600, "to": "2001:db8::4/32", "table": 30600}, {"priority": 30601, "family": "ipv6", "dport": "128 - 256", "invert": true, "table": 30600}, {"priority": 30602, "from": "::/0", "to": "::/0", 
"table": 30600}, {"priority": 200, "from": "198.51.100.56/26", "table": "custom"}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 29922 1726853669.47757: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'ethtest0', 'interface_name': 'ethtest0', 'state': 'up', 'type': 'ethernet', 'autoconnect': True, 'ip': {'dhcp4': False, 'address': ['198.51.100.3/26', '2001:db8::2/32'], 'route': [{'network': '198.51.100.64', 'prefix': 26, 'gateway': '198.51.100.6', 'metric': 4, 'table': 30200}, {'network': '198.51.100.128', 'prefix': 26, 'gateway': '198.51.100.1', 'metric': 2, 'table': 30400}, {'network': '2001:db8::4', 'prefix': 32, 'gateway': '2001:db8::1', 'metric': 2, 'table': 30600}], 'routing_rule': [{'priority': 30200, 'from': '198.51.100.58/26', 'table': 30200}, {'priority': 30201, 'family': 'ipv4', 'fwmark': 1, 'fwmask': 1, 'table': 30200}, {'priority': 30202, 'family': 'ipv4', 'ipproto': 6, 'table': 30200}, {'priority': 30203, 'family': 'ipv4', 'sport': '128 - 256', 'table': 30200}, {'priority': 30204, 'family': 'ipv4', 'tos': 8, 'table': 30200}, {'priority': 30400, 'to': '198.51.100.128/26', 'table': 30400}, {'priority': 30401, 'family': 'ipv4', 'iif': 'iiftest', 'table': 30400}, {'priority': 30402, 'family': 'ipv4', 'oif': 'oiftest', 'table': 30400}, {'priority': 30403, 'from': '0.0.0.0/0', 'to': '0.0.0.0/0', 'table': 30400}, {'priority': 30600, 'to': '2001:db8::4/32', 'table': 30600}, {'priority': 30601, 'family': 'ipv6', 'dport': '128 - 256', 'invert': True, 'table': 30600}, {'priority': 30602, 'from': '::/0', 'to': '::/0', 'table': 30600}, {'priority': 200, 'from': '198.51.100.56/26', 'table': 'custom'}]}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', 
'_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853668.7788675-30776-22255869183236/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 29922 1726853669.47777: _low_level_execute_command(): starting 29922 1726853669.47780: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853668.7788675-30776-22255869183236/ > /dev/null 2>&1 && sleep 0' 29922 1726853669.48461: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853669.48479: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853669.48494: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853669.48511: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853669.48540: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 29922 1726853669.48652: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853669.48678: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853669.48778: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853669.50776: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853669.50792: stdout chunk (state=3): >>><<< 29922 1726853669.50812: stderr chunk (state=3): >>><<< 29922 1726853669.50874: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853669.50889: 
handler run complete 29922 1726853669.51077: attempt loop complete, returning result 29922 1726853669.51080: _execute() done 29922 1726853669.51082: dumping result to json 29922 1726853669.51084: done dumping result, returning 29922 1726853669.51087: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [02083763-bbaf-51d4-513b-000000000027] 29922 1726853669.51094: sending task result for task 02083763-bbaf-51d4-513b-000000000027 29922 1726853669.51574: done sending task result for task 02083763-bbaf-51d4-513b-000000000027 29922 1726853669.51577: WORKER PROCESS EXITING changed: [managed_node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": true, "interface_name": "ethtest0", "ip": { "address": [ "198.51.100.3/26", "2001:db8::2/32" ], "dhcp4": false, "route": [ { "gateway": "198.51.100.6", "metric": 4, "network": "198.51.100.64", "prefix": 26, "table": 30200 }, { "gateway": "198.51.100.1", "metric": 2, "network": "198.51.100.128", "prefix": 26, "table": 30400 }, { "gateway": "2001:db8::1", "metric": 2, "network": "2001:db8::4", "prefix": 32, "table": 30600 } ], "routing_rule": [ { "from": "198.51.100.58/26", "priority": 30200, "table": 30200 }, { "family": "ipv4", "fwmark": 1, "fwmask": 1, "priority": 30201, "table": 30200 }, { "family": "ipv4", "ipproto": 6, "priority": 30202, "table": 30200 }, { "family": "ipv4", "priority": 30203, "sport": "128 - 256", "table": 30200 }, { "family": "ipv4", "priority": 30204, "table": 30200, "tos": 8 }, { "priority": 30400, "table": 30400, "to": "198.51.100.128/26" }, { "family": "ipv4", "iif": "iiftest", "priority": 30401, "table": 30400 }, { "family": "ipv4", "oif": "oiftest", "priority": 30402, "table": 30400 }, { "from": "0.0.0.0/0", "priority": 30403, "table": 30400, "to": "0.0.0.0/0" }, { "priority": 30600, "table": 30600, "to": "2001:db8::4/32" }, { "dport": "128 - 256", "family": "ipv6", "invert": true, "priority": 30601, "table": 30600 }, { "from": "::/0", "priority": 30602, "table": 30600, "to": "::/0" }, { "from": "198.51.100.56/26", "priority": 200, "table": "custom" } ] }, "name": "ethtest0", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [003] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, 0bb5d45f-2ba9-4c52-9232-d0cdb613594b [004] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 0bb5d45f-2ba9-4c52-9232-d0cdb613594b (not-active) 29922 1726853669.52034: no more pending results, returning what we have 29922 1726853669.52038: results queue empty 29922 1726853669.52039: checking for any_errors_fatal 29922 1726853669.52044: done checking for any_errors_fatal 29922 1726853669.52045: checking for max_fail_percentage 29922 1726853669.52046: done checking for max_fail_percentage 29922 1726853669.52047: checking to see if all hosts have failed and the running result is not ok 29922 1726853669.52048: done checking to see if all hosts have failed 29922 1726853669.52049: getting the remaining hosts for this loop 29922 1726853669.52050: done getting the remaining hosts for this loop 29922 1726853669.52054: getting the next task for host managed_node3 29922 1726853669.52063: done getting next task for host managed_node3 29922 1726853669.52067: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 29922 
1726853669.52069: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29922 1726853669.52145: getting variables 29922 1726853669.52146: in VariableManager get_vars() 29922 1726853669.52182: Calling all_inventory to load vars for managed_node3 29922 1726853669.52184: Calling groups_inventory to load vars for managed_node3 29922 1726853669.52187: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853669.52196: Calling all_plugins_play to load vars for managed_node3 29922 1726853669.52199: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853669.52428: Calling groups_plugins_play to load vars for managed_node3 29922 1726853669.53873: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853669.55530: done with get_vars() 29922 1726853669.55563: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 13:34:29 -0400 (0:00:00.879) 0:00:18.486 ****** 29922 1726853669.55662: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 29922 1726853669.55664: Creating lock for fedora.linux_system_roles.network_state 29922 1726853669.56204: worker is 1 (out of 1 available) 29922 1726853669.56216: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state 29922 1726853669.56227: done queuing things up, now waiting for results queue to drain 29922 1726853669.56228: waiting for pending results... 
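For readability, the module_args echoed in the changed result above map onto a network_connections variable roughly like the YAML below. This is a reconstruction from the log output, not the test playbook's actual source, and most of the routing_rule entries are elided:

network_connections:
  - name: ethtest0
    interface_name: ethtest0
    type: ethernet
    state: up
    autoconnect: true
    ip:
      dhcp4: false
      address:
        - 198.51.100.3/26
        - "2001:db8::2/32"
      route:
        - { network: 198.51.100.64, prefix: 26, gateway: 198.51.100.6, metric: 4, table: 30200 }
        - { network: 198.51.100.128, prefix: 26, gateway: 198.51.100.1, metric: 2, table: 30400 }
        - { network: "2001:db8::4", prefix: 32, gateway: "2001:db8::1", metric: 2, table: 30600 }
      routing_rule:
        - { priority: 30200, from: 198.51.100.58/26, table: 30200 }
        - { priority: 30400, to: 198.51.100.128/26, table: 30400 }
        - { priority: 30600, to: "2001:db8::4/32", table: 30600 }
        - { priority: 200, from: 198.51.100.56/26, table: custom }
        # ...the remaining fwmark/ipproto/sport/tos/iif/oif/dport rules shown in the log are omitted here

The role renders this structure into the fedora.linux_system_roles.network_connections module call captured above (provider nm), which reports changed=true after adding and activating the ethtest0 profile.
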
29922 1726853669.56476: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 29922 1726853669.56532: in run() - task 02083763-bbaf-51d4-513b-000000000028 29922 1726853669.56573: variable 'ansible_search_path' from source: unknown 29922 1726853669.56583: variable 'ansible_search_path' from source: unknown 29922 1726853669.56627: calling self._execute() 29922 1726853669.56784: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853669.56791: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853669.56795: variable 'omit' from source: magic vars 29922 1726853669.57242: variable 'ansible_distribution_major_version' from source: facts 29922 1726853669.57263: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853669.57436: variable 'network_state' from source: role '' defaults 29922 1726853669.57444: Evaluated conditional (network_state != {}): False 29922 1726853669.57447: when evaluation is False, skipping this task 29922 1726853669.57450: _execute() done 29922 1726853669.57452: dumping result to json 29922 1726853669.57462: done dumping result, returning 29922 1726853669.57541: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [02083763-bbaf-51d4-513b-000000000028] 29922 1726853669.57545: sending task result for task 02083763-bbaf-51d4-513b-000000000028 29922 1726853669.57624: done sending task result for task 02083763-bbaf-51d4-513b-000000000028 29922 1726853669.57627: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 29922 1726853669.57707: no more pending results, returning what we have 29922 1726853669.57711: results queue empty 29922 1726853669.57711: checking for any_errors_fatal 29922 1726853669.57737: done checking for any_errors_fatal 29922 1726853669.57737: checking for max_fail_percentage 29922 1726853669.57739: done checking for max_fail_percentage 29922 1726853669.57740: checking to see if all hosts have failed and the running result is not ok 29922 1726853669.57741: done checking to see if all hosts have failed 29922 1726853669.57741: getting the remaining hosts for this loop 29922 1726853669.57742: done getting the remaining hosts for this loop 29922 1726853669.57746: getting the next task for host managed_node3 29922 1726853669.57761: done getting next task for host managed_node3 29922 1726853669.57765: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 29922 1726853669.57768: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853669.57788: getting variables 29922 1726853669.57790: in VariableManager get_vars() 29922 1726853669.57830: Calling all_inventory to load vars for managed_node3 29922 1726853669.57834: Calling groups_inventory to load vars for managed_node3 29922 1726853669.57837: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853669.57849: Calling all_plugins_play to load vars for managed_node3 29922 1726853669.57852: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853669.57858: Calling groups_plugins_play to load vars for managed_node3 29922 1726853669.59638: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853669.61335: done with get_vars() 29922 1726853669.61364: done getting variables 29922 1726853669.61437: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 13:34:29 -0400 (0:00:00.058) 0:00:18.544 ****** 29922 1726853669.61476: entering _queue_task() for managed_node3/debug 29922 1726853669.61838: worker is 1 (out of 1 available) 29922 1726853669.61851: exiting _queue_task() for managed_node3/debug 29922 1726853669.61982: done queuing things up, now waiting for results queue to drain 29922 1726853669.61984: waiting for pending results... 29922 1726853669.62201: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 29922 1726853669.62311: in run() - task 02083763-bbaf-51d4-513b-000000000029 29922 1726853669.62326: variable 'ansible_search_path' from source: unknown 29922 1726853669.62330: variable 'ansible_search_path' from source: unknown 29922 1726853669.62361: calling self._execute() 29922 1726853669.62435: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853669.62439: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853669.62450: variable 'omit' from source: magic vars 29922 1726853669.62738: variable 'ansible_distribution_major_version' from source: facts 29922 1726853669.62749: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853669.62752: variable 'omit' from source: magic vars 29922 1726853669.62795: variable 'omit' from source: magic vars 29922 1726853669.62819: variable 'omit' from source: magic vars 29922 1726853669.62851: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29922 1726853669.62880: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29922 1726853669.62896: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29922 1726853669.62909: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853669.62918: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853669.62946: variable 
'inventory_hostname' from source: host vars for 'managed_node3' 29922 1726853669.62949: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853669.62951: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853669.63022: Set connection var ansible_connection to ssh 29922 1726853669.63028: Set connection var ansible_module_compression to ZIP_DEFLATED 29922 1726853669.63037: Set connection var ansible_shell_executable to /bin/sh 29922 1726853669.63043: Set connection var ansible_pipelining to False 29922 1726853669.63048: Set connection var ansible_timeout to 10 29922 1726853669.63051: Set connection var ansible_shell_type to sh 29922 1726853669.63070: variable 'ansible_shell_executable' from source: unknown 29922 1726853669.63075: variable 'ansible_connection' from source: unknown 29922 1726853669.63078: variable 'ansible_module_compression' from source: unknown 29922 1726853669.63081: variable 'ansible_shell_type' from source: unknown 29922 1726853669.63084: variable 'ansible_shell_executable' from source: unknown 29922 1726853669.63086: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853669.63088: variable 'ansible_pipelining' from source: unknown 29922 1726853669.63090: variable 'ansible_timeout' from source: unknown 29922 1726853669.63099: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853669.63194: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 29922 1726853669.63206: variable 'omit' from source: magic vars 29922 1726853669.63209: starting attempt loop 29922 1726853669.63212: running the handler 29922 1726853669.63302: variable '__network_connections_result' from source: set_fact 29922 1726853669.63357: handler run complete 29922 1726853669.63368: attempt loop complete, returning result 29922 1726853669.63373: _execute() done 29922 1726853669.63376: dumping result to json 29922 1726853669.63378: done dumping result, returning 29922 1726853669.63386: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [02083763-bbaf-51d4-513b-000000000029] 29922 1726853669.63389: sending task result for task 02083763-bbaf-51d4-513b-000000000029 29922 1726853669.63473: done sending task result for task 02083763-bbaf-51d4-513b-000000000029 29922 1726853669.63476: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result.stderr_lines": [ "[003] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, 0bb5d45f-2ba9-4c52-9232-d0cdb613594b", "[004] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 0bb5d45f-2ba9-4c52-9232-d0cdb613594b (not-active)" ] } 29922 1726853669.63536: no more pending results, returning what we have 29922 1726853669.63539: results queue empty 29922 1726853669.63540: checking for any_errors_fatal 29922 1726853669.63547: done checking for any_errors_fatal 29922 1726853669.63547: checking for max_fail_percentage 29922 1726853669.63549: done checking for max_fail_percentage 29922 1726853669.63550: checking to see if all hosts have failed and the running result is not ok 29922 1726853669.63551: done checking to see if all hosts have failed 29922 1726853669.63551: 
getting the remaining hosts for this loop 29922 1726853669.63553: done getting the remaining hosts for this loop 29922 1726853669.63559: getting the next task for host managed_node3 29922 1726853669.63564: done getting next task for host managed_node3 29922 1726853669.63567: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 29922 1726853669.63570: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29922 1726853669.63583: getting variables 29922 1726853669.63585: in VariableManager get_vars() 29922 1726853669.63618: Calling all_inventory to load vars for managed_node3 29922 1726853669.63620: Calling groups_inventory to load vars for managed_node3 29922 1726853669.63622: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853669.63630: Calling all_plugins_play to load vars for managed_node3 29922 1726853669.63632: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853669.63634: Calling groups_plugins_play to load vars for managed_node3 29922 1726853669.64426: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853669.65774: done with get_vars() 29922 1726853669.65795: done getting variables 29922 1726853669.65860: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 13:34:29 -0400 (0:00:00.044) 0:00:18.588 ****** 29922 1726853669.65885: entering _queue_task() for managed_node3/debug 29922 1726853669.66142: worker is 1 (out of 1 available) 29922 1726853669.66155: exiting _queue_task() for managed_node3/debug 29922 1726853669.66169: done queuing things up, now waiting for results queue to drain 29922 1726853669.66170: waiting for pending results... 
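
The two debug tasks queued above (tasks/main.yml:177 and :181) only print fields of the __network_connections_result fact that the nm provider recorded earlier; the stderr_lines shown in the result are the provider's per-connection add/up trace for ethtest0. The role's task file itself is not reproduced in this log, but a minimal sketch of the conventional debug/var pattern these tasks appear to follow (an assumption, since only the task names and paths are visible here) would be:

    - name: Show stderr messages for the network_connections
      debug:
        var: __network_connections_result.stderr_lines

    - name: Show debug messages for the network_connections
      debug:
        var: __network_connections_result

Because debug only renders an already-gathered fact on the controller, no remote command is executed for these tasks, which is why the timing headers around them show only a few hundredths of a second (0:00:00.058 and 0:00:00.044) and no _low_level_execute_command() calls appear between queuing and the ok result.
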
29922 1726853669.66359: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 29922 1726853669.66469: in run() - task 02083763-bbaf-51d4-513b-00000000002a 29922 1726853669.66484: variable 'ansible_search_path' from source: unknown 29922 1726853669.66488: variable 'ansible_search_path' from source: unknown 29922 1726853669.66519: calling self._execute() 29922 1726853669.66594: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853669.66599: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853669.66610: variable 'omit' from source: magic vars 29922 1726853669.66892: variable 'ansible_distribution_major_version' from source: facts 29922 1726853669.66902: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853669.66908: variable 'omit' from source: magic vars 29922 1726853669.66951: variable 'omit' from source: magic vars 29922 1726853669.66979: variable 'omit' from source: magic vars 29922 1726853669.67010: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29922 1726853669.67038: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29922 1726853669.67054: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29922 1726853669.67075: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853669.67084: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853669.67109: variable 'inventory_hostname' from source: host vars for 'managed_node3' 29922 1726853669.67112: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853669.67115: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853669.67187: Set connection var ansible_connection to ssh 29922 1726853669.67193: Set connection var ansible_module_compression to ZIP_DEFLATED 29922 1726853669.67200: Set connection var ansible_shell_executable to /bin/sh 29922 1726853669.67207: Set connection var ansible_pipelining to False 29922 1726853669.67212: Set connection var ansible_timeout to 10 29922 1726853669.67214: Set connection var ansible_shell_type to sh 29922 1726853669.67232: variable 'ansible_shell_executable' from source: unknown 29922 1726853669.67235: variable 'ansible_connection' from source: unknown 29922 1726853669.67238: variable 'ansible_module_compression' from source: unknown 29922 1726853669.67240: variable 'ansible_shell_type' from source: unknown 29922 1726853669.67242: variable 'ansible_shell_executable' from source: unknown 29922 1726853669.67244: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853669.67247: variable 'ansible_pipelining' from source: unknown 29922 1726853669.67250: variable 'ansible_timeout' from source: unknown 29922 1726853669.67254: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853669.67359: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 29922 
1726853669.67370: variable 'omit' from source: magic vars 29922 1726853669.67378: starting attempt loop 29922 1726853669.67382: running the handler 29922 1726853669.67421: variable '__network_connections_result' from source: set_fact 29922 1726853669.67483: variable '__network_connections_result' from source: set_fact 29922 1726853669.67718: handler run complete 29922 1726853669.67762: attempt loop complete, returning result 29922 1726853669.67765: _execute() done 29922 1726853669.67767: dumping result to json 29922 1726853669.67776: done dumping result, returning 29922 1726853669.67784: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [02083763-bbaf-51d4-513b-00000000002a] 29922 1726853669.67788: sending task result for task 02083763-bbaf-51d4-513b-00000000002a 29922 1726853669.67893: done sending task result for task 02083763-bbaf-51d4-513b-00000000002a 29922 1726853669.67896: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": true, "interface_name": "ethtest0", "ip": { "address": [ "198.51.100.3/26", "2001:db8::2/32" ], "dhcp4": false, "route": [ { "gateway": "198.51.100.6", "metric": 4, "network": "198.51.100.64", "prefix": 26, "table": 30200 }, { "gateway": "198.51.100.1", "metric": 2, "network": "198.51.100.128", "prefix": 26, "table": 30400 }, { "gateway": "2001:db8::1", "metric": 2, "network": "2001:db8::4", "prefix": 32, "table": 30600 } ], "routing_rule": [ { "from": "198.51.100.58/26", "priority": 30200, "table": 30200 }, { "family": "ipv4", "fwmark": 1, "fwmask": 1, "priority": 30201, "table": 30200 }, { "family": "ipv4", "ipproto": 6, "priority": 30202, "table": 30200 }, { "family": "ipv4", "priority": 30203, "sport": "128 - 256", "table": 30200 }, { "family": "ipv4", "priority": 30204, "table": 30200, "tos": 8 }, { "priority": 30400, "table": 30400, "to": "198.51.100.128/26" }, { "family": "ipv4", "iif": "iiftest", "priority": 30401, "table": 30400 }, { "family": "ipv4", "oif": "oiftest", "priority": 30402, "table": 30400 }, { "from": "0.0.0.0/0", "priority": 30403, "table": 30400, "to": "0.0.0.0/0" }, { "priority": 30600, "table": 30600, "to": "2001:db8::4/32" }, { "dport": "128 - 256", "family": "ipv6", "invert": true, "priority": 30601, "table": 30600 }, { "from": "::/0", "priority": 30602, "table": 30600, "to": "::/0" }, { "from": "198.51.100.56/26", "priority": 200, "table": "custom" } ] }, "name": "ethtest0", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[003] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, 0bb5d45f-2ba9-4c52-9232-d0cdb613594b\n[004] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 0bb5d45f-2ba9-4c52-9232-d0cdb613594b (not-active)\n", "stderr_lines": [ "[003] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, 0bb5d45f-2ba9-4c52-9232-d0cdb613594b", "[004] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 0bb5d45f-2ba9-4c52-9232-d0cdb613594b (not-active)" ] } } 29922 1726853669.68051: no more pending results, returning what we have 29922 1726853669.68054: results queue empty 29922 1726853669.68055: checking for any_errors_fatal 29922 1726853669.68060: done checking for 
any_errors_fatal 29922 1726853669.68060: checking for max_fail_percentage 29922 1726853669.68062: done checking for max_fail_percentage 29922 1726853669.68062: checking to see if all hosts have failed and the running result is not ok 29922 1726853669.68063: done checking to see if all hosts have failed 29922 1726853669.68064: getting the remaining hosts for this loop 29922 1726853669.68065: done getting the remaining hosts for this loop 29922 1726853669.68068: getting the next task for host managed_node3 29922 1726853669.68081: done getting next task for host managed_node3 29922 1726853669.68085: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 29922 1726853669.68087: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29922 1726853669.68097: getting variables 29922 1726853669.68098: in VariableManager get_vars() 29922 1726853669.68126: Calling all_inventory to load vars for managed_node3 29922 1726853669.68129: Calling groups_inventory to load vars for managed_node3 29922 1726853669.68131: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853669.68138: Calling all_plugins_play to load vars for managed_node3 29922 1726853669.68140: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853669.68143: Calling groups_plugins_play to load vars for managed_node3 29922 1726853669.68942: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853669.70570: done with get_vars() 29922 1726853669.70609: done getting variables 29922 1726853669.70673: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 13:34:29 -0400 (0:00:00.048) 0:00:18.636 ****** 29922 1726853669.70715: entering _queue_task() for managed_node3/debug 29922 1726853669.71085: worker is 1 (out of 1 available) 29922 1726853669.71100: exiting _queue_task() for managed_node3/debug 29922 1726853669.71112: done queuing things up, now waiting for results queue to drain 29922 1726853669.71114: waiting for pending results... 
29922 1726853669.71495: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 29922 1726853669.71501: in run() - task 02083763-bbaf-51d4-513b-00000000002b 29922 1726853669.71505: variable 'ansible_search_path' from source: unknown 29922 1726853669.71509: variable 'ansible_search_path' from source: unknown 29922 1726853669.71525: calling self._execute() 29922 1726853669.71626: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853669.71630: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853669.71641: variable 'omit' from source: magic vars 29922 1726853669.72136: variable 'ansible_distribution_major_version' from source: facts 29922 1726853669.72140: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853669.72185: variable 'network_state' from source: role '' defaults 29922 1726853669.72196: Evaluated conditional (network_state != {}): False 29922 1726853669.72199: when evaluation is False, skipping this task 29922 1726853669.72203: _execute() done 29922 1726853669.72206: dumping result to json 29922 1726853669.72208: done dumping result, returning 29922 1726853669.72215: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [02083763-bbaf-51d4-513b-00000000002b] 29922 1726853669.72225: sending task result for task 02083763-bbaf-51d4-513b-00000000002b 29922 1726853669.72312: done sending task result for task 02083763-bbaf-51d4-513b-00000000002b 29922 1726853669.72315: WORKER PROCESS EXITING skipping: [managed_node3] => { "false_condition": "network_state != {}" } 29922 1726853669.72386: no more pending results, returning what we have 29922 1726853669.72390: results queue empty 29922 1726853669.72390: checking for any_errors_fatal 29922 1726853669.72401: done checking for any_errors_fatal 29922 1726853669.72401: checking for max_fail_percentage 29922 1726853669.72403: done checking for max_fail_percentage 29922 1726853669.72404: checking to see if all hosts have failed and the running result is not ok 29922 1726853669.72405: done checking to see if all hosts have failed 29922 1726853669.72405: getting the remaining hosts for this loop 29922 1726853669.72407: done getting the remaining hosts for this loop 29922 1726853669.72410: getting the next task for host managed_node3 29922 1726853669.72415: done getting next task for host managed_node3 29922 1726853669.72419: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 29922 1726853669.72423: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853669.72438: getting variables 29922 1726853669.72439: in VariableManager get_vars() 29922 1726853669.72476: Calling all_inventory to load vars for managed_node3 29922 1726853669.72479: Calling groups_inventory to load vars for managed_node3 29922 1726853669.72481: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853669.72490: Calling all_plugins_play to load vars for managed_node3 29922 1726853669.72492: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853669.72494: Calling groups_plugins_play to load vars for managed_node3 29922 1726853669.74039: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853669.75735: done with get_vars() 29922 1726853669.75766: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 13:34:29 -0400 (0:00:00.051) 0:00:18.688 ****** 29922 1726853669.75877: entering _queue_task() for managed_node3/ping 29922 1726853669.75879: Creating lock for ping 29922 1726853669.76587: worker is 1 (out of 1 available) 29922 1726853669.76606: exiting _queue_task() for managed_node3/ping 29922 1726853669.76619: done queuing things up, now waiting for results queue to drain 29922 1726853669.76621: waiting for pending results... 29922 1726853669.77194: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 29922 1726853669.77199: in run() - task 02083763-bbaf-51d4-513b-00000000002c 29922 1726853669.77203: variable 'ansible_search_path' from source: unknown 29922 1726853669.77206: variable 'ansible_search_path' from source: unknown 29922 1726853669.77241: calling self._execute() 29922 1726853669.77361: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853669.77369: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853669.77381: variable 'omit' from source: magic vars 29922 1726853669.77779: variable 'ansible_distribution_major_version' from source: facts 29922 1726853669.77798: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853669.77804: variable 'omit' from source: magic vars 29922 1726853669.77856: variable 'omit' from source: magic vars 29922 1726853669.77903: variable 'omit' from source: magic vars 29922 1726853669.77942: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29922 1726853669.77982: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29922 1726853669.78008: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29922 1726853669.78026: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853669.78038: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853669.78074: variable 'inventory_hostname' from source: host vars for 'managed_node3' 29922 1726853669.78077: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853669.78080: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853669.78186: Set connection var ansible_connection to ssh 29922 
1726853669.78193: Set connection var ansible_module_compression to ZIP_DEFLATED 29922 1726853669.78202: Set connection var ansible_shell_executable to /bin/sh 29922 1726853669.78210: Set connection var ansible_pipelining to False 29922 1726853669.78215: Set connection var ansible_timeout to 10 29922 1726853669.78226: Set connection var ansible_shell_type to sh 29922 1726853669.78251: variable 'ansible_shell_executable' from source: unknown 29922 1726853669.78254: variable 'ansible_connection' from source: unknown 29922 1726853669.78257: variable 'ansible_module_compression' from source: unknown 29922 1726853669.78267: variable 'ansible_shell_type' from source: unknown 29922 1726853669.78270: variable 'ansible_shell_executable' from source: unknown 29922 1726853669.78274: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853669.78277: variable 'ansible_pipelining' from source: unknown 29922 1726853669.78279: variable 'ansible_timeout' from source: unknown 29922 1726853669.78281: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853669.78576: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 29922 1726853669.78581: variable 'omit' from source: magic vars 29922 1726853669.78584: starting attempt loop 29922 1726853669.78586: running the handler 29922 1726853669.78595: _low_level_execute_command(): starting 29922 1726853669.78597: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 29922 1726853669.79676: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853669.79807: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853669.79908: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853669.79920: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853669.79937: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853669.80083: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853669.81801: stdout chunk (state=3): >>>/root <<< 29922 1726853669.81986: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853669.81989: stdout chunk (state=3): >>><<< 29922 1726853669.81999: stderr chunk (state=3): >>><<< 29922 1726853669.82045: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853669.82065: _low_level_execute_command(): starting 29922 1726853669.82069: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853669.8204498-30818-11250668352252 `" && echo ansible-tmp-1726853669.8204498-30818-11250668352252="` echo /root/.ansible/tmp/ansible-tmp-1726853669.8204498-30818-11250668352252 `" ) && sleep 0' 29922 1726853669.83019: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853669.83036: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853669.83050: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853669.83070: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853669.83092: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 29922 1726853669.83105: stderr chunk (state=3): >>>debug2: match not found <<< 29922 1726853669.83134: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853669.83154: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 29922 1726853669.83188: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853669.83264: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853669.83293: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853669.83387: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853669.85400: stdout chunk (state=3): >>>ansible-tmp-1726853669.8204498-30818-11250668352252=/root/.ansible/tmp/ansible-tmp-1726853669.8204498-30818-11250668352252 <<< 29922 
1726853669.85529: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853669.85546: stdout chunk (state=3): >>><<< 29922 1726853669.85548: stderr chunk (state=3): >>><<< 29922 1726853669.85560: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853669.8204498-30818-11250668352252=/root/.ansible/tmp/ansible-tmp-1726853669.8204498-30818-11250668352252 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853669.85629: variable 'ansible_module_compression' from source: unknown 29922 1726853669.85647: ANSIBALLZ: Using lock for ping 29922 1726853669.85651: ANSIBALLZ: Acquiring lock 29922 1726853669.85653: ANSIBALLZ: Lock acquired: 140376041488064 29922 1726853669.85658: ANSIBALLZ: Creating module 29922 1726853669.96116: ANSIBALLZ: Writing module into payload 29922 1726853669.96157: ANSIBALLZ: Writing module 29922 1726853669.96177: ANSIBALLZ: Renaming module 29922 1726853669.96183: ANSIBALLZ: Done creating module 29922 1726853669.96196: variable 'ansible_facts' from source: unknown 29922 1726853669.96241: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853669.8204498-30818-11250668352252/AnsiballZ_ping.py 29922 1726853669.96341: Sending initial data 29922 1726853669.96344: Sent initial data (152 bytes) 29922 1726853669.96763: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853669.96774: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853669.96803: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853669.96807: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 
1726853669.96858: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853669.96861: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853669.96863: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853669.96935: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853669.98621: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 29922 1726853669.98686: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 29922 1726853669.98758: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-29922avbm1zsd/tmpfs5zhxil /root/.ansible/tmp/ansible-tmp-1726853669.8204498-30818-11250668352252/AnsiballZ_ping.py <<< 29922 1726853669.98761: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853669.8204498-30818-11250668352252/AnsiballZ_ping.py" <<< 29922 1726853669.98854: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-29922avbm1zsd/tmpfs5zhxil" to remote "/root/.ansible/tmp/ansible-tmp-1726853669.8204498-30818-11250668352252/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853669.8204498-30818-11250668352252/AnsiballZ_ping.py" <<< 29922 1726853669.99467: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853669.99504: stderr chunk (state=3): >>><<< 29922 1726853669.99507: stdout chunk (state=3): >>><<< 29922 1726853669.99538: done transferring module to remote 29922 1726853669.99548: _low_level_execute_command(): starting 29922 1726853669.99559: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853669.8204498-30818-11250668352252/ /root/.ansible/tmp/ansible-tmp-1726853669.8204498-30818-11250668352252/AnsiballZ_ping.py && sleep 0' 29922 1726853669.99961: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853669.99994: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 29922 1726853669.99997: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853670.00003: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853670.00046: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853670.00049: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853670.00113: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853670.01968: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853670.01991: stderr chunk (state=3): >>><<< 29922 1726853670.01995: stdout chunk (state=3): >>><<< 29922 1726853670.02007: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853670.02010: _low_level_execute_command(): starting 29922 1726853670.02014: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853669.8204498-30818-11250668352252/AnsiballZ_ping.py && sleep 0' 29922 1726853670.02436: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853670.02439: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853670.02441: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration <<< 29922 1726853670.02444: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 29922 1726853670.02446: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853670.02499: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853670.02502: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853670.02505: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853670.02573: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853670.18095: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 29922 1726853670.19423: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 29922 1726853670.19454: stderr chunk (state=3): >>><<< 29922 1726853670.19460: stdout chunk (state=3): >>><<< 29922 1726853670.19473: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
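
The Re-test connectivity step above shows the full remote-execution round trip for a single module: _low_level_execute_command() first resolves the remote home directory (echo ~), creates a per-task ansible-tmp directory, transfers the self-contained AnsiballZ_ping.py payload over the multiplexed SSH session, marks it executable, and runs it with /usr/bin/python3.12, after which the module prints {"ping": "pong"} and the temporary directory is removed in the cleanup command that follows. The task definition is not shown in this log; assuming the usual form used to verify that a host is still reachable after a network change, it could be as small as:

    - name: Re-test connectivity
      ping:

If the newly activated ethtest0 profile had broken the management connection, this is where it would surface, as an unreachable or failed result rather than the ok: [managed_node3] => {"changed": false, "ping": "pong"} recorded below.
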
29922 1726853670.19494: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853669.8204498-30818-11250668352252/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 29922 1726853670.19501: _low_level_execute_command(): starting 29922 1726853670.19506: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853669.8204498-30818-11250668352252/ > /dev/null 2>&1 && sleep 0' 29922 1726853670.19944: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853670.19976: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853670.19980: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 29922 1726853670.19982: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853670.19984: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853670.19987: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853670.20049: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853670.20052: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853670.20056: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853670.20116: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853670.21997: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853670.22017: stderr chunk (state=3): >>><<< 29922 1726853670.22020: stdout chunk (state=3): >>><<< 29922 1726853670.22036: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853670.22053: handler run complete 29922 1726853670.22065: attempt loop complete, returning result 29922 1726853670.22068: _execute() done 29922 1726853670.22070: dumping result to json 29922 1726853670.22076: done dumping result, returning 29922 1726853670.22084: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [02083763-bbaf-51d4-513b-00000000002c] 29922 1726853670.22087: sending task result for task 02083763-bbaf-51d4-513b-00000000002c 29922 1726853670.22174: done sending task result for task 02083763-bbaf-51d4-513b-00000000002c 29922 1726853670.22177: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "ping": "pong" } 29922 1726853670.22230: no more pending results, returning what we have 29922 1726853670.22234: results queue empty 29922 1726853670.22234: checking for any_errors_fatal 29922 1726853670.22242: done checking for any_errors_fatal 29922 1726853670.22243: checking for max_fail_percentage 29922 1726853670.22244: done checking for max_fail_percentage 29922 1726853670.22245: checking to see if all hosts have failed and the running result is not ok 29922 1726853670.22245: done checking to see if all hosts have failed 29922 1726853670.22246: getting the remaining hosts for this loop 29922 1726853670.22247: done getting the remaining hosts for this loop 29922 1726853670.22251: getting the next task for host managed_node3 29922 1726853670.22261: done getting next task for host managed_node3 29922 1726853670.22263: ^ task is: TASK: meta (role_complete) 29922 1726853670.22265: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853670.22277: getting variables 29922 1726853670.22279: in VariableManager get_vars() 29922 1726853670.22316: Calling all_inventory to load vars for managed_node3 29922 1726853670.22318: Calling groups_inventory to load vars for managed_node3 29922 1726853670.22320: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853670.22329: Calling all_plugins_play to load vars for managed_node3 29922 1726853670.22332: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853670.22334: Calling groups_plugins_play to load vars for managed_node3 29922 1726853670.23201: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853670.24069: done with get_vars() 29922 1726853670.24087: done getting variables 29922 1726853670.24148: done queuing things up, now waiting for results queue to drain 29922 1726853670.24150: results queue empty 29922 1726853670.24150: checking for any_errors_fatal 29922 1726853670.24152: done checking for any_errors_fatal 29922 1726853670.24152: checking for max_fail_percentage 29922 1726853670.24153: done checking for max_fail_percentage 29922 1726853670.24153: checking to see if all hosts have failed and the running result is not ok 29922 1726853670.24154: done checking to see if all hosts have failed 29922 1726853670.24154: getting the remaining hosts for this loop 29922 1726853670.24155: done getting the remaining hosts for this loop 29922 1726853670.24157: getting the next task for host managed_node3 29922 1726853670.24160: done getting next task for host managed_node3 29922 1726853670.24161: ^ task is: TASK: Get the routing rule for looking up the table 30200 29922 1726853670.24162: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853670.24164: getting variables 29922 1726853670.24164: in VariableManager get_vars() 29922 1726853670.24174: Calling all_inventory to load vars for managed_node3 29922 1726853670.24176: Calling groups_inventory to load vars for managed_node3 29922 1726853670.24177: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853670.24180: Calling all_plugins_play to load vars for managed_node3 29922 1726853670.24182: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853670.24183: Calling groups_plugins_play to load vars for managed_node3 29922 1726853670.25309: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853670.27480: done with get_vars() 29922 1726853670.27507: done getting variables 29922 1726853670.27553: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get the routing rule for looking up the table 30200] ********************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:115 Friday 20 September 2024 13:34:30 -0400 (0:00:00.518) 0:00:19.206 ****** 29922 1726853670.27686: entering _queue_task() for managed_node3/command 29922 1726853670.28588: worker is 1 (out of 1 available) 29922 1726853670.28599: exiting _queue_task() for managed_node3/command 29922 1726853670.28608: done queuing things up, now waiting for results queue to drain 29922 1726853670.28609: waiting for pending results... 29922 1726853670.29138: running TaskExecutor() for managed_node3/TASK: Get the routing rule for looking up the table 30200 29922 1726853670.29190: in run() - task 02083763-bbaf-51d4-513b-00000000005c 29922 1726853670.29205: variable 'ansible_search_path' from source: unknown 29922 1726853670.29242: calling self._execute() 29922 1726853670.29518: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853670.29525: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853670.29535: variable 'omit' from source: magic vars 29922 1726853670.30507: variable 'ansible_distribution_major_version' from source: facts 29922 1726853670.30519: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853670.30635: variable 'ansible_distribution_major_version' from source: facts 29922 1726853670.30645: Evaluated conditional (ansible_distribution_major_version != "7"): True 29922 1726853670.30648: variable 'omit' from source: magic vars 29922 1726853670.30681: variable 'omit' from source: magic vars 29922 1726853670.30877: variable 'omit' from source: magic vars 29922 1726853670.30880: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29922 1726853670.30937: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29922 1726853670.30959: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29922 1726853670.30976: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853670.30985: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853670.31129: variable 'inventory_hostname' from source: host vars for 'managed_node3' 29922 1726853670.31132: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853670.31135: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853670.31296: Set connection var ansible_connection to ssh 29922 1726853670.31302: Set connection var ansible_module_compression to ZIP_DEFLATED 29922 1726853670.31309: Set connection var ansible_shell_executable to /bin/sh 29922 1726853670.31317: Set connection var ansible_pipelining to False 29922 1726853670.31321: Set connection var ansible_timeout to 10 29922 1726853670.31324: Set connection var ansible_shell_type to sh 29922 1726853670.31464: variable 'ansible_shell_executable' from source: unknown 29922 1726853670.31468: variable 'ansible_connection' from source: unknown 29922 1726853670.31472: variable 'ansible_module_compression' from source: unknown 29922 1726853670.31475: variable 'ansible_shell_type' from source: unknown 29922 1726853670.31477: variable 'ansible_shell_executable' from source: unknown 29922 1726853670.31479: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853670.31482: variable 'ansible_pipelining' from source: unknown 29922 1726853670.31484: variable 'ansible_timeout' from source: unknown 29922 1726853670.31489: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853670.31743: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 29922 1726853670.31753: variable 'omit' from source: magic vars 29922 1726853670.31763: starting attempt loop 29922 1726853670.31767: running the handler 29922 1726853670.31890: _low_level_execute_command(): starting 29922 1726853670.31897: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 29922 1726853670.33177: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853670.33181: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853670.33187: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853670.33206: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853670.33219: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 
<<< 29922 1726853670.33588: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853670.35008: stdout chunk (state=3): >>>/root <<< 29922 1726853670.35241: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853670.35247: stdout chunk (state=3): >>><<< 29922 1726853670.35257: stderr chunk (state=3): >>><<< 29922 1726853670.35280: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853670.35295: _low_level_execute_command(): starting 29922 1726853670.35301: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853670.3528125-30839-105370100958542 `" && echo ansible-tmp-1726853670.3528125-30839-105370100958542="` echo /root/.ansible/tmp/ansible-tmp-1726853670.3528125-30839-105370100958542 `" ) && sleep 0' 29922 1726853670.36014: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853670.36028: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853670.36068: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853670.36086: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853670.36105: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 29922 1726853670.36137: stderr chunk (state=3): >>>debug2: match not found <<< 29922 1726853670.36147: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853670.36161: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 29922 1726853670.36172: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<< 29922 1726853670.36280: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 
1726853670.36309: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853670.36348: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853670.36360: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853670.36444: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853670.38784: stdout chunk (state=3): >>>ansible-tmp-1726853670.3528125-30839-105370100958542=/root/.ansible/tmp/ansible-tmp-1726853670.3528125-30839-105370100958542 <<< 29922 1726853670.38792: stdout chunk (state=3): >>><<< 29922 1726853670.38796: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853670.38801: stderr chunk (state=3): >>><<< 29922 1726853670.38821: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853670.3528125-30839-105370100958542=/root/.ansible/tmp/ansible-tmp-1726853670.3528125-30839-105370100958542 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853670.38865: variable 'ansible_module_compression' from source: unknown 29922 1726853670.38908: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-29922avbm1zsd/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 29922 1726853670.38946: variable 'ansible_facts' from source: unknown 29922 1726853670.39181: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853670.3528125-30839-105370100958542/AnsiballZ_command.py 29922 1726853670.39432: Sending initial data 29922 1726853670.39436: Sent initial data (156 bytes) 29922 1726853670.41737: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853670.41812: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853670.41898: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853670.42260: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853670.43918: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 29922 1726853670.43980: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 29922 1726853670.44059: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-29922avbm1zsd/tmpr6a2s4s_ /root/.ansible/tmp/ansible-tmp-1726853670.3528125-30839-105370100958542/AnsiballZ_command.py <<< 29922 1726853670.44203: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853670.3528125-30839-105370100958542/AnsiballZ_command.py" <<< 29922 1726853670.44207: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-29922avbm1zsd/tmpr6a2s4s_" to remote "/root/.ansible/tmp/ansible-tmp-1726853670.3528125-30839-105370100958542/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853670.3528125-30839-105370100958542/AnsiballZ_command.py" <<< 29922 1726853670.45308: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853670.45388: stderr chunk (state=3): >>><<< 29922 1726853670.45402: stdout chunk (state=3): >>><<< 29922 1726853670.45478: done transferring module to remote 29922 1726853670.45496: _low_level_execute_command(): starting 29922 1726853670.45506: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853670.3528125-30839-105370100958542/ /root/.ansible/tmp/ansible-tmp-1726853670.3528125-30839-105370100958542/AnsiballZ_command.py && sleep 0' 29922 1726853670.46103: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853670.46119: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853670.46133: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853670.46153: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853670.46177: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 29922 1726853670.46190: stderr chunk (state=3): >>>debug2: match not found <<< 29922 1726853670.46204: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853670.46228: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 29922 1726853670.46246: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<< 29922 1726853670.46261: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 29922 1726853670.46342: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853670.46359: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853670.46376: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853670.46466: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853670.48369: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853670.48395: stdout chunk (state=3): >>><<< 29922 1726853670.48407: stderr chunk (state=3): >>><<< 29922 1726853670.48427: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853670.48441: _low_level_execute_command(): starting 29922 1726853670.48452: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853670.3528125-30839-105370100958542/AnsiballZ_command.py && sleep 0' 29922 1726853670.49067: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853670.49098: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853670.49193: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853670.49217: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853670.49233: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853670.49253: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853670.49356: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853670.65583: stdout chunk (state=3): >>> {"changed": true, "stdout": "30200:\tfrom 198.51.100.58/26 lookup 30200 proto static\n30201:\tfrom all fwmark 0x1/0x1 lookup 30200 proto static\n30202:\tfrom all ipproto tcp lookup 30200 proto static\n30203:\tfrom all sport 128-256 lookup 30200 proto static\n30204:\tfrom all tos throughput lookup 30200 proto static", "stderr": "", "rc": 0, "cmd": ["ip", "rule", "list", "table", "30200"], "start": "2024-09-20 13:34:30.647743", "end": "2024-09-20 13:34:30.654568", "delta": "0:00:00.006825", "msg": "", "invocation": {"module_args": {"_raw_params": "ip rule list table 30200", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 29922 1726853670.67227: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 29922 1726853670.67253: stderr chunk (state=3): >>><<< 29922 1726853670.67259: stdout chunk (state=3): >>><<< 29922 1726853670.67274: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "30200:\tfrom 198.51.100.58/26 lookup 30200 proto static\n30201:\tfrom all fwmark 0x1/0x1 lookup 30200 proto static\n30202:\tfrom all ipproto tcp lookup 30200 proto static\n30203:\tfrom all sport 128-256 lookup 30200 proto static\n30204:\tfrom all tos throughput lookup 30200 proto static", "stderr": "", "rc": 0, "cmd": ["ip", "rule", "list", "table", "30200"], "start": "2024-09-20 13:34:30.647743", "end": "2024-09-20 13:34:30.654568", "delta": "0:00:00.006825", "msg": "", "invocation": {"module_args": {"_raw_params": "ip rule list table 30200", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
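The module result above confirms that table 30200 holds the five expected rules (priorities 30200-30204): a source-prefix selector for 198.51.100.58/26, an fwmark 0x1/0x1 selector, an ipproto tcp selector, a sport 128-256 selector, and a tos throughput selector. The raw module JSON reports changed: true while the callback output further below reports changed: false, which is consistent with the task suppressing change reporting (for example via changed_when: false). A test consuming this output would typically register the command result and assert on stdout; the sketch below only illustrates that pattern and is not taken from tests_routing_rules.yml itself (the register name route_rule_table_30200 and the follow-up assert task are assumptions):

- name: Get the routing rule for looking up the table 30200
  command: ip rule list table 30200
  register: route_rule_table_30200     # assumed register name, not from the playbook
  changed_when: false                  # assumed; matches the "changed": false shown by the callback

- name: Assert that the expected rule selectors are present   # hypothetical follow-up check
  assert:
    that:
      - "'from 198.51.100.58/26 lookup 30200' in route_rule_table_30200.stdout"
      - "'from all fwmark 0x1/0x1 lookup 30200' in route_rule_table_30200.stdout"
      - "'from all ipproto tcp lookup 30200' in route_rule_table_30200.stdout"
      - "'from all sport 128-256 lookup 30200' in route_rule_table_30200.stdout"
      - "'from all tos throughput lookup 30200' in route_rule_table_30200.stdout"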
29922 1726853670.67304: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip rule list table 30200', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853670.3528125-30839-105370100958542/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 29922 1726853670.67312: _low_level_execute_command(): starting 29922 1726853670.67318: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853670.3528125-30839-105370100958542/ > /dev/null 2>&1 && sleep 0' 29922 1726853670.67758: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853670.67793: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 29922 1726853670.67797: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853670.67799: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853670.67801: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853670.67852: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853670.67855: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853670.67857: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853670.67923: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853670.69829: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853670.69856: stderr chunk (state=3): >>><<< 29922 1726853670.69860: stdout chunk (state=3): >>><<< 29922 1726853670.69879: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853670.69885: handler run complete 29922 1726853670.69904: Evaluated conditional (False): False 29922 1726853670.69915: attempt loop complete, returning result 29922 1726853670.69918: _execute() done 29922 1726853670.69920: dumping result to json 29922 1726853670.69925: done dumping result, returning 29922 1726853670.69933: done running TaskExecutor() for managed_node3/TASK: Get the routing rule for looking up the table 30200 [02083763-bbaf-51d4-513b-00000000005c] 29922 1726853670.69937: sending task result for task 02083763-bbaf-51d4-513b-00000000005c 29922 1726853670.70036: done sending task result for task 02083763-bbaf-51d4-513b-00000000005c 29922 1726853670.70038: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "ip", "rule", "list", "table", "30200" ], "delta": "0:00:00.006825", "end": "2024-09-20 13:34:30.654568", "rc": 0, "start": "2024-09-20 13:34:30.647743" } STDOUT: 30200: from 198.51.100.58/26 lookup 30200 proto static 30201: from all fwmark 0x1/0x1 lookup 30200 proto static 30202: from all ipproto tcp lookup 30200 proto static 30203: from all sport 128-256 lookup 30200 proto static 30204: from all tos throughput lookup 30200 proto static 29922 1726853670.70107: no more pending results, returning what we have 29922 1726853670.70110: results queue empty 29922 1726853670.70111: checking for any_errors_fatal 29922 1726853670.70113: done checking for any_errors_fatal 29922 1726853670.70114: checking for max_fail_percentage 29922 1726853670.70115: done checking for max_fail_percentage 29922 1726853670.70116: checking to see if all hosts have failed and the running result is not ok 29922 1726853670.70117: done checking to see if all hosts have failed 29922 1726853670.70117: getting the remaining hosts for this loop 29922 1726853670.70119: done getting the remaining hosts for this loop 29922 1726853670.70122: getting the next task for host managed_node3 29922 1726853670.70128: done getting next task for host managed_node3 29922 1726853670.70130: ^ task is: TASK: Get the routing rule for looking up the table 30400 29922 1726853670.70132: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853670.70137: getting variables 29922 1726853670.70138: in VariableManager get_vars() 29922 1726853670.70176: Calling all_inventory to load vars for managed_node3 29922 1726853670.70178: Calling groups_inventory to load vars for managed_node3 29922 1726853670.70180: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853670.70192: Calling all_plugins_play to load vars for managed_node3 29922 1726853670.70194: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853670.70197: Calling groups_plugins_play to load vars for managed_node3 29922 1726853670.71013: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853670.71973: done with get_vars() 29922 1726853670.71988: done getting variables 29922 1726853670.72032: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get the routing rule for looking up the table 30400] ********************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:122 Friday 20 September 2024 13:34:30 -0400 (0:00:00.443) 0:00:19.650 ****** 29922 1726853670.72051: entering _queue_task() for managed_node3/command 29922 1726853670.72292: worker is 1 (out of 1 available) 29922 1726853670.72306: exiting _queue_task() for managed_node3/command 29922 1726853670.72320: done queuing things up, now waiting for results queue to drain 29922 1726853670.72321: waiting for pending results... 29922 1726853670.72504: running TaskExecutor() for managed_node3/TASK: Get the routing rule for looking up the table 30400 29922 1726853670.72569: in run() - task 02083763-bbaf-51d4-513b-00000000005d 29922 1726853670.72581: variable 'ansible_search_path' from source: unknown 29922 1726853670.72610: calling self._execute() 29922 1726853670.72689: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853670.72695: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853670.72705: variable 'omit' from source: magic vars 29922 1726853670.73290: variable 'ansible_distribution_major_version' from source: facts 29922 1726853670.73294: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853670.73296: variable 'ansible_distribution_major_version' from source: facts 29922 1726853670.73299: Evaluated conditional (ansible_distribution_major_version != "7"): True 29922 1726853670.73301: variable 'omit' from source: magic vars 29922 1726853670.73303: variable 'omit' from source: magic vars 29922 1726853670.73305: variable 'omit' from source: magic vars 29922 1726853670.73333: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29922 1726853670.73370: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29922 1726853670.73389: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29922 1726853670.73406: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853670.73425: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853670.73447: variable 'inventory_hostname' from source: host vars for 'managed_node3' 29922 1726853670.73450: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853670.73453: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853670.73547: Set connection var ansible_connection to ssh 29922 1726853670.73555: Set connection var ansible_module_compression to ZIP_DEFLATED 29922 1726853670.73643: Set connection var ansible_shell_executable to /bin/sh 29922 1726853670.73647: Set connection var ansible_pipelining to False 29922 1726853670.73649: Set connection var ansible_timeout to 10 29922 1726853670.73651: Set connection var ansible_shell_type to sh 29922 1726853670.73653: variable 'ansible_shell_executable' from source: unknown 29922 1726853670.73656: variable 'ansible_connection' from source: unknown 29922 1726853670.73657: variable 'ansible_module_compression' from source: unknown 29922 1726853670.73659: variable 'ansible_shell_type' from source: unknown 29922 1726853670.73661: variable 'ansible_shell_executable' from source: unknown 29922 1726853670.73663: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853670.73665: variable 'ansible_pipelining' from source: unknown 29922 1726853670.73667: variable 'ansible_timeout' from source: unknown 29922 1726853670.73669: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853670.73859: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 29922 1726853670.73870: variable 'omit' from source: magic vars 29922 1726853670.73881: starting attempt loop 29922 1726853670.73883: running the handler 29922 1726853670.73902: _low_level_execute_command(): starting 29922 1726853670.73955: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 29922 1726853670.74448: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853670.74457: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 29922 1726853670.74467: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853670.74489: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853670.74493: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853670.74550: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853670.74558: 
stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853670.74560: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853670.74621: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853670.76625: stdout chunk (state=3): >>>/root <<< 29922 1726853670.76628: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853670.76630: stdout chunk (state=3): >>><<< 29922 1726853670.76633: stderr chunk (state=3): >>><<< 29922 1726853670.76651: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853670.76670: _low_level_execute_command(): starting 29922 1726853670.76745: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853670.7665782-30870-139451600892715 `" && echo ansible-tmp-1726853670.7665782-30870-139451600892715="` echo /root/.ansible/tmp/ansible-tmp-1726853670.7665782-30870-139451600892715 `" ) && sleep 0' 29922 1726853670.77257: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853670.77275: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853670.77291: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853670.77394: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853670.77416: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853670.77440: stderr chunk 
(state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853670.77459: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853670.77555: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853670.79534: stdout chunk (state=3): >>>ansible-tmp-1726853670.7665782-30870-139451600892715=/root/.ansible/tmp/ansible-tmp-1726853670.7665782-30870-139451600892715 <<< 29922 1726853670.79691: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853670.79694: stdout chunk (state=3): >>><<< 29922 1726853670.79697: stderr chunk (state=3): >>><<< 29922 1726853670.79719: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853670.7665782-30870-139451600892715=/root/.ansible/tmp/ansible-tmp-1726853670.7665782-30870-139451600892715 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853670.79755: variable 'ansible_module_compression' from source: unknown 29922 1726853670.79821: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-29922avbm1zsd/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 29922 1726853670.80035: variable 'ansible_facts' from source: unknown 29922 1726853670.80038: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853670.7665782-30870-139451600892715/AnsiballZ_command.py 29922 1726853670.80189: Sending initial data 29922 1726853670.80192: Sent initial data (156 bytes) 29922 1726853670.80702: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853670.80711: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853670.80720: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853670.80735: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853670.80786: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853670.80836: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853670.80849: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853670.80899: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853670.80964: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853670.82563: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 29922 1726853670.82615: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 29922 1726853670.82675: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-29922avbm1zsd/tmp76etgz1d /root/.ansible/tmp/ansible-tmp-1726853670.7665782-30870-139451600892715/AnsiballZ_command.py <<< 29922 1726853670.82678: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853670.7665782-30870-139451600892715/AnsiballZ_command.py" <<< 29922 1726853670.82733: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-29922avbm1zsd/tmp76etgz1d" to remote "/root/.ansible/tmp/ansible-tmp-1726853670.7665782-30870-139451600892715/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853670.7665782-30870-139451600892715/AnsiballZ_command.py" <<< 29922 1726853670.83380: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853670.83383: stderr chunk (state=3): >>><<< 29922 1726853670.83389: stdout chunk (state=3): >>><<< 29922 1726853670.83449: done transferring module to remote 29922 1726853670.83459: _low_level_execute_command(): starting 29922 1726853670.83462: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853670.7665782-30870-139451600892715/ /root/.ansible/tmp/ansible-tmp-1726853670.7665782-30870-139451600892715/AnsiballZ_command.py && sleep 0' 29922 1726853670.84106: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853670.84109: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853670.84112: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853670.84115: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853670.84121: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853670.84181: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853670.84192: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853670.84321: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853670.86090: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853670.86114: stderr chunk (state=3): >>><<< 29922 1726853670.86117: stdout chunk (state=3): >>><<< 29922 1726853670.86134: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853670.86137: _low_level_execute_command(): starting 29922 1726853670.86139: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853670.7665782-30870-139451600892715/AnsiballZ_command.py && sleep 0' 29922 1726853670.86714: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853670.86793: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853671.02706: stdout chunk (state=3): >>> {"changed": true, "stdout": "30400:\tfrom all to 198.51.100.128/26 lookup 30400 proto static\n30401:\tfrom all iif iiftest [detached] lookup 30400 proto static\n30402:\tfrom all oif oiftest [detached] lookup 30400 proto static\n30403:\tfrom all lookup 30400 proto static", "stderr": "", "rc": 0, "cmd": ["ip", "rule", "list", "table", "30400"], "start": "2024-09-20 13:34:31.020663", "end": "2024-09-20 13:34:31.024594", "delta": "0:00:00.003931", "msg": "", "invocation": {"module_args": {"_raw_params": "ip rule list table 30400", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 29922 1726853671.04379: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 29922 1726853671.04382: stdout chunk (state=3): >>><<< 29922 1726853671.04385: stderr chunk (state=3): >>><<< 29922 1726853671.04387: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "30400:\tfrom all to 198.51.100.128/26 lookup 30400 proto static\n30401:\tfrom all iif iiftest [detached] lookup 30400 proto static\n30402:\tfrom all oif oiftest [detached] lookup 30400 proto static\n30403:\tfrom all lookup 30400 proto static", "stderr": "", "rc": 0, "cmd": ["ip", "rule", "list", "table", "30400"], "start": "2024-09-20 13:34:31.020663", "end": "2024-09-20 13:34:31.024594", "delta": "0:00:00.003931", "msg": "", "invocation": {"module_args": {"_raw_params": "ip rule list table 30400", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
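In the table 30400 listing above, the iif and oif rules reference devices named iiftest and oiftest, and iproute2 prints [detached] after a device name when that interface is not currently present on the host, so those rules remain configured but are inactive. The 30200, 30400, and upcoming 30600 lookups all run the same ip rule list command and differ only in the table number, so a loop could gather them in one task; the sketch below is purely illustrative (the register name rule_listings is an assumption), and the actual playbook runs three separate tasks, as this log shows:

- name: Get the routing rules for the test tables   # illustrative consolidation only
  command: "ip rule list table {{ item }}"
  register: rule_listings          # assumed register name
  changed_when: false              # assumed, to keep the read-only check from reporting a change
  loop:
    - 30200
    - 30400
    - 30600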
29922 1726853671.04390: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip rule list table 30400', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853670.7665782-30870-139451600892715/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 29922 1726853671.04393: _low_level_execute_command(): starting 29922 1726853671.04395: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853670.7665782-30870-139451600892715/ > /dev/null 2>&1 && sleep 0' 29922 1726853671.04897: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853671.04905: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853671.04924: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853671.04939: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853671.04952: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 29922 1726853671.04960: stderr chunk (state=3): >>>debug2: match not found <<< 29922 1726853671.04970: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853671.04992: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 29922 1726853671.05000: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<< 29922 1726853671.05007: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 29922 1726853671.05015: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853671.05031: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853671.05043: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853671.05050: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 29922 1726853671.05082: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853671.05123: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853671.05149: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853671.05158: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853671.05237: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853671.07173: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853671.07178: stdout chunk (state=3): >>><<< 29922 1726853671.07181: stderr chunk (state=3): >>><<< 29922 1726853671.07198: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 
2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853671.07203: handler run complete 29922 1726853671.07228: Evaluated conditional (False): False 29922 1726853671.07244: attempt loop complete, returning result 29922 1726853671.07247: _execute() done 29922 1726853671.07249: dumping result to json 29922 1726853671.07251: done dumping result, returning 29922 1726853671.07292: done running TaskExecutor() for managed_node3/TASK: Get the routing rule for looking up the table 30400 [02083763-bbaf-51d4-513b-00000000005d] 29922 1726853671.07296: sending task result for task 02083763-bbaf-51d4-513b-00000000005d 29922 1726853671.07378: done sending task result for task 02083763-bbaf-51d4-513b-00000000005d 29922 1726853671.07381: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "ip", "rule", "list", "table", "30400" ], "delta": "0:00:00.003931", "end": "2024-09-20 13:34:31.024594", "rc": 0, "start": "2024-09-20 13:34:31.020663" } STDOUT: 30400: from all to 198.51.100.128/26 lookup 30400 proto static 30401: from all iif iiftest [detached] lookup 30400 proto static 30402: from all oif oiftest [detached] lookup 30400 proto static 30403: from all lookup 30400 proto static 29922 1726853671.07597: no more pending results, returning what we have 29922 1726853671.07601: results queue empty 29922 1726853671.07602: checking for any_errors_fatal 29922 1726853671.07610: done checking for any_errors_fatal 29922 1726853671.07611: checking for max_fail_percentage 29922 1726853671.07612: done checking for max_fail_percentage 29922 1726853671.07614: checking to see if all hosts have failed and the running result is not ok 29922 1726853671.07614: done checking to see if all hosts have failed 29922 1726853671.07615: getting the remaining hosts for this loop 29922 1726853671.07617: done getting the remaining hosts for this loop 29922 1726853671.07620: getting the next task for host managed_node3 29922 1726853671.07626: done getting next task for host managed_node3 29922 1726853671.07628: ^ task is: TASK: Get the routing rule for looking up the table 30600 29922 1726853671.07631: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853671.07635: getting variables 29922 1726853671.07637: in VariableManager get_vars() 29922 1726853671.07813: Calling all_inventory to load vars for managed_node3 29922 1726853671.07816: Calling groups_inventory to load vars for managed_node3 29922 1726853671.07819: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853671.07829: Calling all_plugins_play to load vars for managed_node3 29922 1726853671.07833: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853671.07836: Calling groups_plugins_play to load vars for managed_node3 29922 1726853671.09172: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853671.10030: done with get_vars() 29922 1726853671.10046: done getting variables 29922 1726853671.10094: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get the routing rule for looking up the table 30600] ********************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:129 Friday 20 September 2024 13:34:31 -0400 (0:00:00.380) 0:00:20.030 ****** 29922 1726853671.10116: entering _queue_task() for managed_node3/command 29922 1726853671.10361: worker is 1 (out of 1 available) 29922 1726853671.10377: exiting _queue_task() for managed_node3/command 29922 1726853671.10390: done queuing things up, now waiting for results queue to drain 29922 1726853671.10392: waiting for pending results... 29922 1726853671.10742: running TaskExecutor() for managed_node3/TASK: Get the routing rule for looking up the table 30600 29922 1726853671.10914: in run() - task 02083763-bbaf-51d4-513b-00000000005e 29922 1726853671.11044: variable 'ansible_search_path' from source: unknown 29922 1726853671.11283: calling self._execute() 29922 1726853671.11334: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853671.11364: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853671.11465: variable 'omit' from source: magic vars 29922 1726853671.11868: variable 'ansible_distribution_major_version' from source: facts 29922 1726853671.11937: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853671.12088: variable 'ansible_distribution_major_version' from source: facts 29922 1726853671.12100: Evaluated conditional (ansible_distribution_major_version != "7"): True 29922 1726853671.12106: variable 'omit' from source: magic vars 29922 1726853671.12122: variable 'omit' from source: magic vars 29922 1726853671.12147: variable 'omit' from source: magic vars 29922 1726853671.12187: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29922 1726853671.12216: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29922 1726853671.12232: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29922 1726853671.12245: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853671.12255: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853671.12283: variable 'inventory_hostname' from source: host vars for 'managed_node3' 29922 1726853671.12286: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853671.12288: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853671.12357: Set connection var ansible_connection to ssh 29922 1726853671.12366: Set connection var ansible_module_compression to ZIP_DEFLATED 29922 1726853671.12374: Set connection var ansible_shell_executable to /bin/sh 29922 1726853671.12381: Set connection var ansible_pipelining to False 29922 1726853671.12386: Set connection var ansible_timeout to 10 29922 1726853671.12388: Set connection var ansible_shell_type to sh 29922 1726853671.12406: variable 'ansible_shell_executable' from source: unknown 29922 1726853671.12409: variable 'ansible_connection' from source: unknown 29922 1726853671.12412: variable 'ansible_module_compression' from source: unknown 29922 1726853671.12424: variable 'ansible_shell_type' from source: unknown 29922 1726853671.12427: variable 'ansible_shell_executable' from source: unknown 29922 1726853671.12429: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853671.12431: variable 'ansible_pipelining' from source: unknown 29922 1726853671.12433: variable 'ansible_timeout' from source: unknown 29922 1726853671.12435: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853671.12534: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 29922 1726853671.12541: variable 'omit' from source: magic vars 29922 1726853671.12548: starting attempt loop 29922 1726853671.12551: running the handler 29922 1726853671.12566: _low_level_execute_command(): starting 29922 1726853671.12574: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 29922 1726853671.13036: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853671.13076: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 29922 1726853671.13084: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853671.13087: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853671.13090: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853671.13129: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853671.13132: 
stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853671.13138: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853671.13201: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853671.15204: stdout chunk (state=3): >>>/root <<< 29922 1726853671.15208: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853671.15254: stderr chunk (state=3): >>><<< 29922 1726853671.15260: stdout chunk (state=3): >>><<< 29922 1726853671.15307: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853671.15323: _low_level_execute_command(): starting 29922 1726853671.15326: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853671.153053-30890-38023242881225 `" && echo ansible-tmp-1726853671.153053-30890-38023242881225="` echo /root/.ansible/tmp/ansible-tmp-1726853671.153053-30890-38023242881225 `" ) && sleep 0' 29922 1726853671.16267: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853671.16306: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853671.16359: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 
1726853671.18366: stdout chunk (state=3): >>>ansible-tmp-1726853671.153053-30890-38023242881225=/root/.ansible/tmp/ansible-tmp-1726853671.153053-30890-38023242881225 <<< 29922 1726853671.18614: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853671.18618: stdout chunk (state=3): >>><<< 29922 1726853671.18620: stderr chunk (state=3): >>><<< 29922 1726853671.18623: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853671.153053-30890-38023242881225=/root/.ansible/tmp/ansible-tmp-1726853671.153053-30890-38023242881225 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853671.18880: variable 'ansible_module_compression' from source: unknown 29922 1726853671.18884: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-29922avbm1zsd/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 29922 1726853671.18887: variable 'ansible_facts' from source: unknown 29922 1726853671.18961: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853671.153053-30890-38023242881225/AnsiballZ_command.py 29922 1726853671.19394: Sending initial data 29922 1726853671.19397: Sent initial data (154 bytes) 29922 1726853671.20290: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853671.20309: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853671.20332: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 
1726853671.20431: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853671.22087: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 29922 1726853671.22110: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 29922 1726853671.22192: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 29922 1726853671.22282: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-29922avbm1zsd/tmp6c2ufpbp /root/.ansible/tmp/ansible-tmp-1726853671.153053-30890-38023242881225/AnsiballZ_command.py <<< 29922 1726853671.22293: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853671.153053-30890-38023242881225/AnsiballZ_command.py" <<< 29922 1726853671.22374: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 29922 1726853671.22404: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-29922avbm1zsd/tmp6c2ufpbp" to remote "/root/.ansible/tmp/ansible-tmp-1726853671.153053-30890-38023242881225/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853671.153053-30890-38023242881225/AnsiballZ_command.py" <<< 29922 1726853671.23421: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853671.23508: stderr chunk (state=3): >>><<< 29922 1726853671.23511: stdout chunk (state=3): >>><<< 29922 1726853671.23615: done transferring module to remote 29922 1726853671.23619: _low_level_execute_command(): starting 29922 1726853671.23623: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853671.153053-30890-38023242881225/ /root/.ansible/tmp/ansible-tmp-1726853671.153053-30890-38023242881225/AnsiballZ_command.py && sleep 0' 29922 1726853671.24401: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853671.24415: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853671.24492: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853671.24549: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853671.24636: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853671.26543: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853671.26546: stdout chunk (state=3): >>><<< 29922 1726853671.26549: stderr chunk (state=3): >>><<< 29922 1726853671.26569: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853671.26660: _low_level_execute_command(): starting 29922 1726853671.26664: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853671.153053-30890-38023242881225/AnsiballZ_command.py && sleep 0' 29922 1726853671.27211: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853671.27224: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853671.27239: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853671.27284: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 29922 1726853671.27300: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853671.27380: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853671.27405: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853671.27437: stderr chunk 
(state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853671.27515: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853671.43485: stdout chunk (state=3): >>> {"changed": true, "stdout": "30600:\tfrom all to 2001:db8::4/32 lookup 30600 proto static\n30601:\tnot from all dport 128-256 lookup 30600 proto static\n30602:\tfrom all lookup 30600 proto static", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "rule", "list", "table", "30600"], "start": "2024-09-20 13:34:31.429873", "end": "2024-09-20 13:34:31.433729", "delta": "0:00:00.003856", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 rule list table 30600", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 29922 1726853671.45421: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 29922 1726853671.45425: stdout chunk (state=3): >>><<< 29922 1726853671.45427: stderr chunk (state=3): >>><<< 29922 1726853671.45429: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "30600:\tfrom all to 2001:db8::4/32 lookup 30600 proto static\n30601:\tnot from all dport 128-256 lookup 30600 proto static\n30602:\tfrom all lookup 30600 proto static", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "rule", "list", "table", "30600"], "start": "2024-09-20 13:34:31.429873", "end": "2024-09-20 13:34:31.433729", "delta": "0:00:00.003856", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 rule list table 30600", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
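The chunk above is the complete round trip for the task at tests_routing_rules.yml:129: the command module ran `ip -6 rule list table 30600` on the managed node and returned rc=0 with three matching rules in stdout. A minimal stand-alone sketch of the same check outside Ansible, assuming iproute2 is on PATH; the helper name and the assertions are illustrative only, with the expected substrings taken from the module result shown above:

    import subprocess

    def list_rules(table: str, family: str = "-6") -> list[str]:
        """Return the non-empty lines of `ip <family> rule list table <table>`."""
        out = subprocess.run(
            ["ip", family, "rule", "list", "table", table],
            capture_output=True, text=True, check=True,
        ).stdout
        return [line.strip() for line in out.splitlines() if line.strip()]

    if __name__ == "__main__":
        rules = list_rules("30600")
        # The module result above reported exactly these three rules for table 30600.
        assert any("from all to 2001:db8::4/32 lookup 30600" in r for r in rules)
        assert any("not from all dport 128-256 lookup 30600" in r for r in rules)
        assert any(r.startswith("30602:") and "from all lookup 30600" in r for r in rules)
        print("\n".join(rules))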
29922 1726853671.45433: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -6 rule list table 30600', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853671.153053-30890-38023242881225/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 29922 1726853671.45435: _low_level_execute_command(): starting 29922 1726853671.45437: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853671.153053-30890-38023242881225/ > /dev/null 2>&1 && sleep 0' 29922 1726853671.46479: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853671.46534: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853671.46538: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853671.46541: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853671.46543: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 29922 1726853671.46628: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853671.46639: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853671.46652: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853671.46750: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853671.48626: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853671.48688: stderr chunk (state=3): >>><<< 29922 1726853671.48691: stdout chunk (state=3): >>><<< 29922 1726853671.48717: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853671.48725: handler run complete 29922 1726853671.48749: Evaluated conditional (False): False 29922 1726853671.48763: attempt loop complete, returning result 29922 1726853671.48766: _execute() done 29922 1726853671.48768: dumping result to json 29922 1726853671.48976: done dumping result, returning 29922 1726853671.48979: done running TaskExecutor() for managed_node3/TASK: Get the routing rule for looking up the table 30600 [02083763-bbaf-51d4-513b-00000000005e] 29922 1726853671.48981: sending task result for task 02083763-bbaf-51d4-513b-00000000005e 29922 1726853671.49045: done sending task result for task 02083763-bbaf-51d4-513b-00000000005e 29922 1726853671.49048: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "ip", "-6", "rule", "list", "table", "30600" ], "delta": "0:00:00.003856", "end": "2024-09-20 13:34:31.433729", "rc": 0, "start": "2024-09-20 13:34:31.429873" } STDOUT: 30600: from all to 2001:db8::4/32 lookup 30600 proto static 30601: not from all dport 128-256 lookup 30600 proto static 30602: from all lookup 30600 proto static 29922 1726853671.49157: no more pending results, returning what we have 29922 1726853671.49160: results queue empty 29922 1726853671.49161: checking for any_errors_fatal 29922 1726853671.49168: done checking for any_errors_fatal 29922 1726853671.49169: checking for max_fail_percentage 29922 1726853671.49173: done checking for max_fail_percentage 29922 1726853671.49174: checking to see if all hosts have failed and the running result is not ok 29922 1726853671.49175: done checking to see if all hosts have failed 29922 1726853671.49175: getting the remaining hosts for this loop 29922 1726853671.49176: done getting the remaining hosts for this loop 29922 1726853671.49181: getting the next task for host managed_node3 29922 1726853671.49187: done getting next task for host managed_node3 29922 1726853671.49189: ^ task is: TASK: Get the routing rule for looking up the table 'custom' 29922 1726853671.49191: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853671.49196: getting variables 29922 1726853671.49197: in VariableManager get_vars() 29922 1726853671.49232: Calling all_inventory to load vars for managed_node3 29922 1726853671.49235: Calling groups_inventory to load vars for managed_node3 29922 1726853671.49237: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853671.49248: Calling all_plugins_play to load vars for managed_node3 29922 1726853671.49251: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853671.49254: Calling groups_plugins_play to load vars for managed_node3 29922 1726853671.51030: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853671.57273: done with get_vars() 29922 1726853671.57296: done getting variables 29922 1726853671.57349: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get the routing rule for looking up the table 'custom'] ****************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:136 Friday 20 September 2024 13:34:31 -0400 (0:00:00.472) 0:00:20.503 ****** 29922 1726853671.57375: entering _queue_task() for managed_node3/command 29922 1726853671.57881: worker is 1 (out of 1 available) 29922 1726853671.57893: exiting _queue_task() for managed_node3/command 29922 1726853671.57904: done queuing things up, now waiting for results queue to drain 29922 1726853671.57905: waiting for pending results... 29922 1726853671.58097: running TaskExecutor() for managed_node3/TASK: Get the routing rule for looking up the table 'custom' 29922 1726853671.58196: in run() - task 02083763-bbaf-51d4-513b-00000000005f 29922 1726853671.58476: variable 'ansible_search_path' from source: unknown 29922 1726853671.58480: calling self._execute() 29922 1726853671.58484: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853671.58487: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853671.58489: variable 'omit' from source: magic vars 29922 1726853671.58781: variable 'ansible_distribution_major_version' from source: facts 29922 1726853671.58800: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853671.58923: variable 'ansible_distribution_major_version' from source: facts 29922 1726853671.58926: Evaluated conditional (ansible_distribution_major_version != "7"): True 29922 1726853671.58934: variable 'omit' from source: magic vars 29922 1726853671.58953: variable 'omit' from source: magic vars 29922 1726853671.58990: variable 'omit' from source: magic vars 29922 1726853671.59043: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29922 1726853671.59080: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29922 1726853671.59100: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29922 1726853671.59124: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853671.59136: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853671.59170: variable 'inventory_hostname' from source: host vars for 'managed_node3' 29922 1726853671.59175: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853671.59177: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853671.59376: Set connection var ansible_connection to ssh 29922 1726853671.59380: Set connection var ansible_module_compression to ZIP_DEFLATED 29922 1726853671.59383: Set connection var ansible_shell_executable to /bin/sh 29922 1726853671.59386: Set connection var ansible_pipelining to False 29922 1726853671.59388: Set connection var ansible_timeout to 10 29922 1726853671.59390: Set connection var ansible_shell_type to sh 29922 1726853671.59392: variable 'ansible_shell_executable' from source: unknown 29922 1726853671.59394: variable 'ansible_connection' from source: unknown 29922 1726853671.59396: variable 'ansible_module_compression' from source: unknown 29922 1726853671.59399: variable 'ansible_shell_type' from source: unknown 29922 1726853671.59401: variable 'ansible_shell_executable' from source: unknown 29922 1726853671.59403: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853671.59405: variable 'ansible_pipelining' from source: unknown 29922 1726853671.59407: variable 'ansible_timeout' from source: unknown 29922 1726853671.59409: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853671.59496: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 29922 1726853671.59508: variable 'omit' from source: magic vars 29922 1726853671.59514: starting attempt loop 29922 1726853671.59517: running the handler 29922 1726853671.59533: _low_level_execute_command(): starting 29922 1726853671.59540: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 29922 1726853671.60288: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853671.60300: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853671.60313: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853671.60329: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853671.60342: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 29922 1726853671.60362: stderr chunk (state=3): >>>debug2: match not found <<< 29922 1726853671.60374: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853671.60390: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 29922 1726853671.60400: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<< 29922 1726853671.60407: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 29922 1726853671.60416: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853671.60426: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853671.60438: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853671.60446: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 29922 1726853671.60576: stderr chunk (state=3): >>>debug2: match found <<< 29922 1726853671.60580: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853671.60582: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853671.60585: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853671.60669: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853671.62378: stdout chunk (state=3): >>>/root <<< 29922 1726853671.62473: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853671.62514: stderr chunk (state=3): >>><<< 29922 1726853671.62517: stdout chunk (state=3): >>><<< 29922 1726853671.62546: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853671.62560: _low_level_execute_command(): starting 29922 1726853671.62564: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853671.6254458-30923-260163236741265 `" && echo ansible-tmp-1726853671.6254458-30923-260163236741265="` echo /root/.ansible/tmp/ansible-tmp-1726853671.6254458-30923-260163236741265 `" ) && sleep 0' 29922 1726853671.63186: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853671.63217: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853671.63229: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853671.63249: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853671.63335: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853671.65309: stdout chunk (state=3): >>>ansible-tmp-1726853671.6254458-30923-260163236741265=/root/.ansible/tmp/ansible-tmp-1726853671.6254458-30923-260163236741265 <<< 29922 1726853671.65430: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853671.65447: stderr chunk (state=3): >>><<< 29922 1726853671.65450: stdout chunk (state=3): >>><<< 29922 1726853671.65479: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853671.6254458-30923-260163236741265=/root/.ansible/tmp/ansible-tmp-1726853671.6254458-30923-260163236741265 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853671.65523: variable 'ansible_module_compression' from source: unknown 29922 1726853671.65574: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-29922avbm1zsd/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 29922 1726853671.65677: variable 'ansible_facts' from source: unknown 29922 1726853671.65716: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853671.6254458-30923-260163236741265/AnsiballZ_command.py 29922 1726853671.65951: Sending initial data 29922 1726853671.65963: Sent initial data (156 bytes) 29922 1726853671.66445: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853671.66488: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853671.66582: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853671.66585: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853671.66626: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853671.66694: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853671.68268: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 29922 1726853671.68324: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 29922 1726853671.68383: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-29922avbm1zsd/tmpavzc93vf /root/.ansible/tmp/ansible-tmp-1726853671.6254458-30923-260163236741265/AnsiballZ_command.py <<< 29922 1726853671.68386: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853671.6254458-30923-260163236741265/AnsiballZ_command.py" <<< 29922 1726853671.68440: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-29922avbm1zsd/tmpavzc93vf" to remote "/root/.ansible/tmp/ansible-tmp-1726853671.6254458-30923-260163236741265/AnsiballZ_command.py" <<< 29922 1726853671.68443: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853671.6254458-30923-260163236741265/AnsiballZ_command.py" <<< 29922 1726853671.69028: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853671.69214: stderr chunk (state=3): >>><<< 29922 1726853671.69217: stdout chunk (state=3): >>><<< 29922 1726853671.69220: done transferring module to remote 29922 1726853671.69223: _low_level_execute_command(): starting 29922 1726853671.69226: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853671.6254458-30923-260163236741265/ /root/.ansible/tmp/ansible-tmp-1726853671.6254458-30923-260163236741265/AnsiballZ_command.py && sleep 0' 29922 1726853671.69768: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853671.69854: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853671.69870: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853671.69900: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853671.70046: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853671.70094: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853671.70105: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853671.70181: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853671.70281: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853671.72121: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853671.72146: stderr chunk (state=3): >>><<< 29922 1726853671.72149: stdout chunk (state=3): >>><<< 29922 1726853671.72161: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853671.72186: _low_level_execute_command(): starting 29922 1726853671.72192: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853671.6254458-30923-260163236741265/AnsiballZ_command.py && sleep 0' 29922 1726853671.72622: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853671.72626: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853671.72629: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration <<< 29922 1726853671.72631: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 29922 1726853671.72634: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853671.72676: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853671.72691: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853671.72764: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853671.88449: stdout chunk (state=3): >>> {"changed": true, "stdout": "200:\tfrom 198.51.100.56/26 lookup custom proto static", "stderr": "", "rc": 0, "cmd": ["ip", "rule", "list", "table", "custom"], "start": "2024-09-20 13:34:31.879421", "end": "2024-09-20 13:34:31.883165", "delta": "0:00:00.003744", "msg": "", "invocation": {"module_args": {"_raw_params": "ip rule list table custom", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 29922 1726853671.90187: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared 
connection to 10.31.11.217 closed. <<< 29922 1726853671.90191: stdout chunk (state=3): >>><<< 29922 1726853671.90194: stderr chunk (state=3): >>><<< 29922 1726853671.90196: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "200:\tfrom 198.51.100.56/26 lookup custom proto static", "stderr": "", "rc": 0, "cmd": ["ip", "rule", "list", "table", "custom"], "start": "2024-09-20 13:34:31.879421", "end": "2024-09-20 13:34:31.883165", "delta": "0:00:00.003744", "msg": "", "invocation": {"module_args": {"_raw_params": "ip rule list table custom", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
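The same pattern repeats above for the named table: `ip rule list table custom` returns a single IPv4 rule, "200: from 198.51.100.56/26 lookup custom proto static". A named table only resolves because it is mapped to a table number, normally in /etc/iproute2/rt_tables or a drop-in under rt_tables.d/. A short illustrative sketch, reusing the subprocess approach from the earlier aside, that splits each returned rule into its priority and selector; the helper name is hypothetical:

    import subprocess

    def rule_lines(table: str, ipv6: bool = False) -> list[str]:
        """Non-empty lines of `ip [-6] rule list table <table>`."""
        cmd = ["ip"] + (["-6"] if ipv6 else []) + ["rule", "list", "table", table]
        out = subprocess.run(cmd, capture_output=True, text=True, check=True).stdout
        return [line.strip() for line in out.splitlines() if line.strip()]

    # Expected from the result above: priority 200, selector
    # "from 198.51.100.56/26 lookup custom proto static".
    for line in rule_lines("custom"):
        priority, _, selector = line.partition(":")
        print(f"priority={priority.strip():>5}  selector={selector.strip()}")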
29922 1726853671.90199: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip rule list table custom', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853671.6254458-30923-260163236741265/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 29922 1726853671.90202: _low_level_execute_command(): starting 29922 1726853671.90204: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853671.6254458-30923-260163236741265/ > /dev/null 2>&1 && sleep 0' 29922 1726853671.91094: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853671.91140: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853671.91152: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853671.91172: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853671.91260: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853671.93235: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853671.93477: stderr chunk (state=3): >>><<< 29922 1726853671.93481: stdout chunk (state=3): >>><<< 29922 1726853671.93484: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 
10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853671.93487: handler run complete 29922 1726853671.93489: Evaluated conditional (False): False 29922 1726853671.93491: attempt loop complete, returning result 29922 1726853671.93493: _execute() done 29922 1726853671.93495: dumping result to json 29922 1726853671.93497: done dumping result, returning 29922 1726853671.93499: done running TaskExecutor() for managed_node3/TASK: Get the routing rule for looking up the table 'custom' [02083763-bbaf-51d4-513b-00000000005f] 29922 1726853671.93501: sending task result for task 02083763-bbaf-51d4-513b-00000000005f 29922 1726853671.93590: done sending task result for task 02083763-bbaf-51d4-513b-00000000005f 29922 1726853671.93593: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "ip", "rule", "list", "table", "custom" ], "delta": "0:00:00.003744", "end": "2024-09-20 13:34:31.883165", "rc": 0, "start": "2024-09-20 13:34:31.879421" } STDOUT: 200: from 198.51.100.56/26 lookup custom proto static 29922 1726853671.93679: no more pending results, returning what we have 29922 1726853671.93684: results queue empty 29922 1726853671.93685: checking for any_errors_fatal 29922 1726853671.93696: done checking for any_errors_fatal 29922 1726853671.93696: checking for max_fail_percentage 29922 1726853671.93698: done checking for max_fail_percentage 29922 1726853671.93700: checking to see if all hosts have failed and the running result is not ok 29922 1726853671.93701: done checking to see if all hosts have failed 29922 1726853671.93701: getting the remaining hosts for this loop 29922 1726853671.93703: done getting the remaining hosts for this loop 29922 1726853671.93707: getting the next task for host managed_node3 29922 1726853671.93714: done getting next task for host managed_node3 29922 1726853671.93716: ^ task is: TASK: Get the IPv4 routing rule for the connection "{{ interface }}" 29922 1726853671.93718: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853671.93723: getting variables 29922 1726853671.93725: in VariableManager get_vars() 29922 1726853671.93770: Calling all_inventory to load vars for managed_node3 29922 1726853671.93889: Calling groups_inventory to load vars for managed_node3 29922 1726853671.93893: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853671.93906: Calling all_plugins_play to load vars for managed_node3 29922 1726853671.93910: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853671.93913: Calling groups_plugins_play to load vars for managed_node3 29922 1726853671.95675: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853671.97340: done with get_vars() 29922 1726853671.97379: done getting variables 29922 1726853671.97441: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 29922 1726853671.97585: variable 'interface' from source: set_fact TASK [Get the IPv4 routing rule for the connection "ethtest0"] ***************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:143 Friday 20 September 2024 13:34:31 -0400 (0:00:00.402) 0:00:20.905 ****** 29922 1726853671.97615: entering _queue_task() for managed_node3/command 29922 1726853671.97994: worker is 1 (out of 1 available) 29922 1726853671.98006: exiting _queue_task() for managed_node3/command 29922 1726853671.98133: done queuing things up, now waiting for results queue to drain 29922 1726853671.98135: waiting for pending results... 
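The task header above is already rendered ("ethtest0"), while the HOST STATE line two entries back still shows the raw name, TASK: Get the IPv4 routing rule for the connection "{{ interface }}"; the log also records that `interface` comes from set_fact. A minimal Jinja2 sketch of that substitution step (Ansible performs it through its own Templar, so this only illustrates the effect):

    from jinja2 import Template

    # Raw task name as it appears in the HOST STATE line above; `interface`
    # was set earlier via set_fact and renders to "ethtest0" here.
    raw_name = 'Get the IPv4 routing rule for the connection "{{ interface }}"'
    print(Template(raw_name).render(interface="ethtest0"))
    # -> Get the IPv4 routing rule for the connection "ethtest0"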
29922 1726853671.98493: running TaskExecutor() for managed_node3/TASK: Get the IPv4 routing rule for the connection "ethtest0" 29922 1726853671.98499: in run() - task 02083763-bbaf-51d4-513b-000000000060 29922 1726853671.98502: variable 'ansible_search_path' from source: unknown 29922 1726853671.98522: calling self._execute() 29922 1726853671.98635: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853671.98699: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853671.98702: variable 'omit' from source: magic vars 29922 1726853671.99082: variable 'ansible_distribution_major_version' from source: facts 29922 1726853671.99100: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853671.99113: variable 'omit' from source: magic vars 29922 1726853671.99145: variable 'omit' from source: magic vars 29922 1726853671.99252: variable 'interface' from source: set_fact 29922 1726853671.99281: variable 'omit' from source: magic vars 29922 1726853671.99324: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29922 1726853671.99462: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29922 1726853671.99466: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29922 1726853671.99468: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853671.99472: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853671.99475: variable 'inventory_hostname' from source: host vars for 'managed_node3' 29922 1726853671.99483: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853671.99492: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853671.99601: Set connection var ansible_connection to ssh 29922 1726853671.99615: Set connection var ansible_module_compression to ZIP_DEFLATED 29922 1726853671.99628: Set connection var ansible_shell_executable to /bin/sh 29922 1726853671.99640: Set connection var ansible_pipelining to False 29922 1726853671.99649: Set connection var ansible_timeout to 10 29922 1726853671.99658: Set connection var ansible_shell_type to sh 29922 1726853671.99694: variable 'ansible_shell_executable' from source: unknown 29922 1726853671.99702: variable 'ansible_connection' from source: unknown 29922 1726853671.99709: variable 'ansible_module_compression' from source: unknown 29922 1726853671.99714: variable 'ansible_shell_type' from source: unknown 29922 1726853671.99719: variable 'ansible_shell_executable' from source: unknown 29922 1726853671.99724: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853671.99729: variable 'ansible_pipelining' from source: unknown 29922 1726853671.99733: variable 'ansible_timeout' from source: unknown 29922 1726853671.99787: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853671.99905: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 29922 1726853671.99925: variable 'omit' from source: 
magic vars 29922 1726853671.99936: starting attempt loop 29922 1726853671.99943: running the handler 29922 1726853671.99967: _low_level_execute_command(): starting 29922 1726853671.99980: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 29922 1726853672.00717: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853672.00734: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853672.00751: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853672.00785: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853672.00890: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853672.00906: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853672.00927: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853672.01022: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853672.02766: stdout chunk (state=3): >>>/root <<< 29922 1726853672.02924: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853672.02927: stdout chunk (state=3): >>><<< 29922 1726853672.02930: stderr chunk (state=3): >>><<< 29922 1726853672.02965: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853672.02990: _low_level_execute_command(): starting 29922 1726853672.03087: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& 
mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853672.0297484-30948-238034381726486 `" && echo ansible-tmp-1726853672.0297484-30948-238034381726486="` echo /root/.ansible/tmp/ansible-tmp-1726853672.0297484-30948-238034381726486 `" ) && sleep 0' 29922 1726853672.03719: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853672.03732: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853672.03766: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853672.03881: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853672.03931: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853672.03999: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853672.05966: stdout chunk (state=3): >>>ansible-tmp-1726853672.0297484-30948-238034381726486=/root/.ansible/tmp/ansible-tmp-1726853672.0297484-30948-238034381726486 <<< 29922 1726853672.06118: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853672.06279: stdout chunk (state=3): >>><<< 29922 1726853672.06282: stderr chunk (state=3): >>><<< 29922 1726853672.06285: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853672.0297484-30948-238034381726486=/root/.ansible/tmp/ansible-tmp-1726853672.0297484-30948-238034381726486 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853672.06288: variable 'ansible_module_compression' from source: unknown 
29922 1726853672.06290: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-29922avbm1zsd/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 29922 1726853672.06314: variable 'ansible_facts' from source: unknown 29922 1726853672.06410: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853672.0297484-30948-238034381726486/AnsiballZ_command.py 29922 1726853672.06603: Sending initial data 29922 1726853672.06606: Sent initial data (156 bytes) 29922 1726853672.07278: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853672.07319: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853672.07405: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853672.09018: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 29922 1726853672.09099: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 29922 1726853672.09174: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-29922avbm1zsd/tmpjj_201x3 /root/.ansible/tmp/ansible-tmp-1726853672.0297484-30948-238034381726486/AnsiballZ_command.py <<< 29922 1726853672.09177: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853672.0297484-30948-238034381726486/AnsiballZ_command.py" <<< 29922 1726853672.09238: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-29922avbm1zsd/tmpjj_201x3" to remote "/root/.ansible/tmp/ansible-tmp-1726853672.0297484-30948-238034381726486/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853672.0297484-30948-238034381726486/AnsiballZ_command.py" <<< 29922 1726853672.10043: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853672.10172: stderr chunk (state=3): >>><<< 29922 1726853672.10177: stdout chunk (state=3): >>><<< 29922 1726853672.10179: done transferring module to remote 29922 1726853672.10181: _low_level_execute_command(): starting 29922 1726853672.10184: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853672.0297484-30948-238034381726486/ /root/.ansible/tmp/ansible-tmp-1726853672.0297484-30948-238034381726486/AnsiballZ_command.py && sleep 0' 29922 1726853672.10748: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853672.10799: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853672.10810: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853672.10833: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853672.10929: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853672.12829: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853672.12846: stdout chunk (state=3): >>><<< 29922 1726853672.12862: stderr chunk (state=3): >>><<< 29922 1726853672.12888: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853672.12902: _low_level_execute_command(): starting 29922 1726853672.12912: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853672.0297484-30948-238034381726486/AnsiballZ_command.py && sleep 0' 29922 1726853672.13560: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853672.13582: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853672.13603: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853672.13632: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853672.13688: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853672.13749: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853672.13772: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853672.13797: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853672.13906: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853672.31044: stdout chunk (state=3): >>> {"changed": true, "stdout": "ipv4.routing-rules: priority 30200 from 198.51.100.58/26 table 30200, priority 30201 from 0.0.0.0/0 fwmark 0x1/0x1 table 30200, priority 30202 from 0.0.0.0/0 ipproto 6 table 30200, priority 30203 from 0.0.0.0/0 sport 128-256 table 30200, priority 30204 from 0.0.0.0/0 tos 0x08 table 30200, priority 30400 to 198.51.100.128/26 table 30400, priority 30401 from 0.0.0.0/0 iif iiftest table 30400, priority 30402 from 0.0.0.0/0 oif oiftest table 30400, priority 30403 from 0.0.0.0/0 table 30400, priority 200 from 198.51.100.56/26 table 200", "stderr": "", "rc": 0, "cmd": ["nmcli", "-f", "ipv4.routing-rules", "c", "show", "ethtest0"], "start": "2024-09-20 13:34:32.291888", "end": "2024-09-20 13:34:32.308927", "delta": "0:00:00.017039", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f ipv4.routing-rules c show \"ethtest0\"", 
"_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 29922 1726853672.32787: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 29922 1726853672.32792: stdout chunk (state=3): >>><<< 29922 1726853672.32795: stderr chunk (state=3): >>><<< 29922 1726853672.32799: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "ipv4.routing-rules: priority 30200 from 198.51.100.58/26 table 30200, priority 30201 from 0.0.0.0/0 fwmark 0x1/0x1 table 30200, priority 30202 from 0.0.0.0/0 ipproto 6 table 30200, priority 30203 from 0.0.0.0/0 sport 128-256 table 30200, priority 30204 from 0.0.0.0/0 tos 0x08 table 30200, priority 30400 to 198.51.100.128/26 table 30400, priority 30401 from 0.0.0.0/0 iif iiftest table 30400, priority 30402 from 0.0.0.0/0 oif oiftest table 30400, priority 30403 from 0.0.0.0/0 table 30400, priority 200 from 198.51.100.56/26 table 200", "stderr": "", "rc": 0, "cmd": ["nmcli", "-f", "ipv4.routing-rules", "c", "show", "ethtest0"], "start": "2024-09-20 13:34:32.291888", "end": "2024-09-20 13:34:32.308927", "delta": "0:00:00.017039", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f ipv4.routing-rules c show \"ethtest0\"", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
29922 1726853672.32802: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f ipv4.routing-rules c show "ethtest0"', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853672.0297484-30948-238034381726486/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 29922 1726853672.32805: _low_level_execute_command(): starting 29922 1726853672.32808: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853672.0297484-30948-238034381726486/ > /dev/null 2>&1 && sleep 0' 29922 1726853672.33437: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853672.33452: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853672.33474: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853672.33495: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853672.33598: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853672.33623: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853672.33714: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853672.35618: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853672.35622: stdout chunk (state=3): >>><<< 29922 1726853672.35628: stderr chunk (state=3): >>><<< 29922 1726853672.35646: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853672.35652: handler run complete 29922 1726853672.35680: Evaluated conditional (False): False 29922 1726853672.35776: attempt loop complete, returning result 29922 1726853672.35779: _execute() done 29922 1726853672.35781: dumping result to json 29922 1726853672.35783: done dumping result, returning 29922 1726853672.35784: done running TaskExecutor() for managed_node3/TASK: Get the IPv4 routing rule for the connection "ethtest0" [02083763-bbaf-51d4-513b-000000000060] 29922 1726853672.35786: sending task result for task 02083763-bbaf-51d4-513b-000000000060 ok: [managed_node3] => { "changed": false, "cmd": [ "nmcli", "-f", "ipv4.routing-rules", "c", "show", "ethtest0" ], "delta": "0:00:00.017039", "end": "2024-09-20 13:34:32.308927", "rc": 0, "start": "2024-09-20 13:34:32.291888" } STDOUT: ipv4.routing-rules: priority 30200 from 198.51.100.58/26 table 30200, priority 30201 from 0.0.0.0/0 fwmark 0x1/0x1 table 30200, priority 30202 from 0.0.0.0/0 ipproto 6 table 30200, priority 30203 from 0.0.0.0/0 sport 128-256 table 30200, priority 30204 from 0.0.0.0/0 tos 0x08 table 30200, priority 30400 to 198.51.100.128/26 table 30400, priority 30401 from 0.0.0.0/0 iif iiftest table 30400, priority 30402 from 0.0.0.0/0 oif oiftest table 30400, priority 30403 from 0.0.0.0/0 table 30400, priority 200 from 198.51.100.56/26 table 200 29922 1726853672.35927: no more pending results, returning what we have 29922 1726853672.35974: results queue empty 29922 1726853672.35976: checking for any_errors_fatal 29922 1726853672.35985: done checking for any_errors_fatal 29922 1726853672.35986: checking for max_fail_percentage 29922 1726853672.35988: done checking for max_fail_percentage 29922 1726853672.35989: checking to see if all hosts have failed and the running result is not ok 29922 1726853672.35990: done checking to see if all hosts have failed 29922 1726853672.35991: getting the remaining hosts for this loop 29922 1726853672.35992: done getting the remaining hosts for this loop 29922 1726853672.35996: getting the next task for host managed_node3 29922 1726853672.36002: done getting next task for host managed_node3 29922 1726853672.36005: ^ task is: TASK: Get the IPv6 routing rule for the connection "{{ interface }}" 29922 1726853672.36007: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853672.36012: getting variables 29922 1726853672.36013: in VariableManager get_vars() 29922 1726853672.36185: Calling all_inventory to load vars for managed_node3 29922 1726853672.36188: Calling groups_inventory to load vars for managed_node3 29922 1726853672.36190: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853672.36202: Calling all_plugins_play to load vars for managed_node3 29922 1726853672.36205: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853672.36209: Calling groups_plugins_play to load vars for managed_node3 29922 1726853672.36877: done sending task result for task 02083763-bbaf-51d4-513b-000000000060 29922 1726853672.36881: WORKER PROCESS EXITING 29922 1726853672.37819: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853672.39381: done with get_vars() 29922 1726853672.39406: done getting variables 29922 1726853672.39474: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 29922 1726853672.39597: variable 'interface' from source: set_fact TASK [Get the IPv6 routing rule for the connection "ethtest0"] ***************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:149 Friday 20 September 2024 13:34:32 -0400 (0:00:00.420) 0:00:21.325 ****** 29922 1726853672.39627: entering _queue_task() for managed_node3/command 29922 1726853672.39985: worker is 1 (out of 1 available) 29922 1726853672.39999: exiting _queue_task() for managed_node3/command 29922 1726853672.40011: done queuing things up, now waiting for results queue to drain 29922 1726853672.40013: waiting for pending results... 
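The entries above cover the task Get the IPv4 routing rule for the connection "ethtest0": the command action runs nmcli -f ipv4.routing-rules c show "ethtest0" over the multiplexed SSH connection to 10.31.11.217, and the result is reported back with changed: false. A task of roughly the following shape would produce that invocation; this is only a sketch, with the register variable name and changed_when handling assumed rather than read from tests_routing_rules.yml (the command, task name, and interface variable come from the log itself):

    - name: Get the IPv4 routing rule for the connection "{{ interface }}"
      command: nmcli -f ipv4.routing-rules c show "{{ interface }}"
      register: route_rule_table_ipv4   # register name is an assumption, not taken from the playbook
      changed_when: false               # consistent with the "Evaluated conditional (False): False" entry above
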
29922 1726853672.40392: running TaskExecutor() for managed_node3/TASK: Get the IPv6 routing rule for the connection "ethtest0" 29922 1726853672.40432: in run() - task 02083763-bbaf-51d4-513b-000000000061 29922 1726853672.40453: variable 'ansible_search_path' from source: unknown 29922 1726853672.40502: calling self._execute() 29922 1726853672.40619: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853672.40634: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853672.40651: variable 'omit' from source: magic vars 29922 1726853672.41054: variable 'ansible_distribution_major_version' from source: facts 29922 1726853672.41076: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853672.41088: variable 'omit' from source: magic vars 29922 1726853672.41111: variable 'omit' from source: magic vars 29922 1726853672.41207: variable 'interface' from source: set_fact 29922 1726853672.41230: variable 'omit' from source: magic vars 29922 1726853672.41285: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29922 1726853672.41320: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29922 1726853672.41343: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29922 1726853672.41372: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853672.41387: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853672.41418: variable 'inventory_hostname' from source: host vars for 'managed_node3' 29922 1726853672.41424: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853672.41430: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853672.41575: Set connection var ansible_connection to ssh 29922 1726853672.41579: Set connection var ansible_module_compression to ZIP_DEFLATED 29922 1726853672.41581: Set connection var ansible_shell_executable to /bin/sh 29922 1726853672.41583: Set connection var ansible_pipelining to False 29922 1726853672.41585: Set connection var ansible_timeout to 10 29922 1726853672.41587: Set connection var ansible_shell_type to sh 29922 1726853672.41611: variable 'ansible_shell_executable' from source: unknown 29922 1726853672.41618: variable 'ansible_connection' from source: unknown 29922 1726853672.41625: variable 'ansible_module_compression' from source: unknown 29922 1726853672.41631: variable 'ansible_shell_type' from source: unknown 29922 1726853672.41637: variable 'ansible_shell_executable' from source: unknown 29922 1726853672.41643: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853672.41701: variable 'ansible_pipelining' from source: unknown 29922 1726853672.41704: variable 'ansible_timeout' from source: unknown 29922 1726853672.41706: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853672.41829: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 29922 1726853672.41845: variable 'omit' from source: 
magic vars 29922 1726853672.41857: starting attempt loop 29922 1726853672.41864: running the handler 29922 1726853672.41884: _low_level_execute_command(): starting 29922 1726853672.41895: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 29922 1726853672.42634: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853672.42653: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853672.42774: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853672.42795: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853672.42808: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853672.42912: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853672.44634: stdout chunk (state=3): >>>/root <<< 29922 1726853672.44783: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853672.44790: stdout chunk (state=3): >>><<< 29922 1726853672.44810: stderr chunk (state=3): >>><<< 29922 1726853672.44931: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853672.44936: _low_level_execute_command(): starting 29922 1726853672.44939: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853672.4483354-30961-4300301948001 `" && echo 
ansible-tmp-1726853672.4483354-30961-4300301948001="` echo /root/.ansible/tmp/ansible-tmp-1726853672.4483354-30961-4300301948001 `" ) && sleep 0' 29922 1726853672.45507: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853672.45520: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853672.45535: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853672.45558: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853672.45587: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 29922 1726853672.45622: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853672.45685: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853672.45728: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853672.45745: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853672.45773: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853672.45867: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853672.47859: stdout chunk (state=3): >>>ansible-tmp-1726853672.4483354-30961-4300301948001=/root/.ansible/tmp/ansible-tmp-1726853672.4483354-30961-4300301948001 <<< 29922 1726853672.48022: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853672.48026: stdout chunk (state=3): >>><<< 29922 1726853672.48028: stderr chunk (state=3): >>><<< 29922 1726853672.48176: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853672.4483354-30961-4300301948001=/root/.ansible/tmp/ansible-tmp-1726853672.4483354-30961-4300301948001 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853672.48180: variable 'ansible_module_compression' from source: unknown 29922 1726853672.48182: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-29922avbm1zsd/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 29922 1726853672.48199: variable 'ansible_facts' from source: unknown 29922 1726853672.48293: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853672.4483354-30961-4300301948001/AnsiballZ_command.py 29922 1726853672.48815: Sending initial data 29922 1726853672.48818: Sent initial data (154 bytes) 29922 1726853672.50016: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853672.50225: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853672.50291: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853672.50318: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853672.50536: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853672.52305: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 29922 1726853672.52431: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 29922 1726853672.52524: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-29922avbm1zsd/tmpb81yyln0 /root/.ansible/tmp/ansible-tmp-1726853672.4483354-30961-4300301948001/AnsiballZ_command.py <<< 29922 1726853672.52537: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853672.4483354-30961-4300301948001/AnsiballZ_command.py" <<< 29922 1726853672.52629: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-29922avbm1zsd/tmpb81yyln0" to remote "/root/.ansible/tmp/ansible-tmp-1726853672.4483354-30961-4300301948001/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853672.4483354-30961-4300301948001/AnsiballZ_command.py" <<< 29922 1726853672.53920: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853672.53944: stderr chunk (state=3): >>><<< 29922 1726853672.53952: stdout chunk (state=3): >>><<< 29922 1726853672.53991: done transferring module to remote 29922 1726853672.54005: _low_level_execute_command(): starting 29922 1726853672.54012: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853672.4483354-30961-4300301948001/ /root/.ansible/tmp/ansible-tmp-1726853672.4483354-30961-4300301948001/AnsiballZ_command.py && sleep 0' 29922 1726853672.54614: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853672.54629: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853672.54645: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853672.54708: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853672.54780: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853672.54806: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853672.54845: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853672.54904: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853672.56877: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853672.56881: stdout chunk (state=3): >>><<< 29922 1726853672.56884: stderr chunk (state=3): >>><<< 29922 1726853672.56886: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 
originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853672.56889: _low_level_execute_command(): starting 29922 1726853672.56892: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853672.4483354-30961-4300301948001/AnsiballZ_command.py && sleep 0' 29922 1726853672.57437: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853672.57446: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853672.57459: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853672.57470: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853672.57489: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 29922 1726853672.57522: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 29922 1726853672.57529: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853672.57601: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853672.57640: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853672.57711: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853672.74998: stdout chunk (state=3): >>> {"changed": true, "stdout": "ipv6.routing-rules: priority 30600 to 2001:db8::4/32 table 30600, priority 30601 not from ::/0 dport 128-256 table 30600, priority 30602 from ::/0 table 30600", "stderr": "", "rc": 0, "cmd": ["nmcli", "-f", "ipv6.routing-rules", "c", "show", "ethtest0"], "start": "2024-09-20 13:34:32.731757", "end": "2024-09-20 13:34:32.748608", "delta": "0:00:00.016851", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f ipv6.routing-rules c show \"ethtest0\"", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, 
"removes": null, "stdin": null}}} <<< 29922 1726853672.76629: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 29922 1726853672.76634: stdout chunk (state=3): >>><<< 29922 1726853672.76639: stderr chunk (state=3): >>><<< 29922 1726853672.76659: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "ipv6.routing-rules: priority 30600 to 2001:db8::4/32 table 30600, priority 30601 not from ::/0 dport 128-256 table 30600, priority 30602 from ::/0 table 30600", "stderr": "", "rc": 0, "cmd": ["nmcli", "-f", "ipv6.routing-rules", "c", "show", "ethtest0"], "start": "2024-09-20 13:34:32.731757", "end": "2024-09-20 13:34:32.748608", "delta": "0:00:00.016851", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f ipv6.routing-rules c show \"ethtest0\"", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
29922 1726853672.76687: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f ipv6.routing-rules c show "ethtest0"', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853672.4483354-30961-4300301948001/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 29922 1726853672.76694: _low_level_execute_command(): starting 29922 1726853672.76699: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853672.4483354-30961-4300301948001/ > /dev/null 2>&1 && sleep 0' 29922 1726853672.77139: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853672.77147: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853672.77150: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853672.77154: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 29922 1726853672.77156: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853672.77201: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853672.77204: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853672.77268: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853672.79128: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853672.79148: stderr chunk (state=3): >>><<< 29922 1726853672.79152: stdout chunk (state=3): >>><<< 29922 1726853672.79169: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853672.79176: handler run complete 29922 1726853672.79194: Evaluated conditional (False): False 29922 1726853672.79202: attempt loop complete, returning result 29922 1726853672.79209: _execute() done 29922 1726853672.79211: dumping result to json 29922 1726853672.79217: done dumping result, returning 29922 1726853672.79225: done running TaskExecutor() for managed_node3/TASK: Get the IPv6 routing rule for the connection "ethtest0" [02083763-bbaf-51d4-513b-000000000061] 29922 1726853672.79228: sending task result for task 02083763-bbaf-51d4-513b-000000000061 29922 1726853672.79318: done sending task result for task 02083763-bbaf-51d4-513b-000000000061 29922 1726853672.79322: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "nmcli", "-f", "ipv6.routing-rules", "c", "show", "ethtest0" ], "delta": "0:00:00.016851", "end": "2024-09-20 13:34:32.748608", "rc": 0, "start": "2024-09-20 13:34:32.731757" } STDOUT: ipv6.routing-rules: priority 30600 to 2001:db8::4/32 table 30600, priority 30601 not from ::/0 dport 128-256 table 30600, priority 30602 from ::/0 table 30600 29922 1726853672.79390: no more pending results, returning what we have 29922 1726853672.79394: results queue empty 29922 1726853672.79395: checking for any_errors_fatal 29922 1726853672.79403: done checking for any_errors_fatal 29922 1726853672.79404: checking for max_fail_percentage 29922 1726853672.79405: done checking for max_fail_percentage 29922 1726853672.79406: checking to see if all hosts have failed and the running result is not ok 29922 1726853672.79407: done checking to see if all hosts have failed 29922 1726853672.79407: getting the remaining hosts for this loop 29922 1726853672.79409: done getting the remaining hosts for this loop 29922 1726853672.79412: getting the next task for host managed_node3 29922 1726853672.79418: done getting next task for host managed_node3 29922 1726853672.79420: ^ task is: TASK: Assert that the routing rule with table lookup 30200 matches the specified rule 29922 1726853672.79422: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853672.79426: getting variables 29922 1726853672.79427: in VariableManager get_vars() 29922 1726853672.79464: Calling all_inventory to load vars for managed_node3 29922 1726853672.79467: Calling groups_inventory to load vars for managed_node3 29922 1726853672.79469: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853672.79487: Calling all_plugins_play to load vars for managed_node3 29922 1726853672.79490: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853672.79493: Calling groups_plugins_play to load vars for managed_node3 29922 1726853672.80297: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853672.81162: done with get_vars() 29922 1726853672.81180: done getting variables 29922 1726853672.81226: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Assert that the routing rule with table lookup 30200 matches the specified rule] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:155 Friday 20 September 2024 13:34:32 -0400 (0:00:00.416) 0:00:21.742 ****** 29922 1726853672.81247: entering _queue_task() for managed_node3/assert 29922 1726853672.81498: worker is 1 (out of 1 available) 29922 1726853672.81512: exiting _queue_task() for managed_node3/assert 29922 1726853672.81526: done queuing things up, now waiting for results queue to drain 29922 1726853672.81527: waiting for pending results... 
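For reference, the nmcli call logged above (task: Get the IPv6 routing rule for the connection "ethtest0") is an ansible.legacy.command invocation whose output the later IPv6 connection assert consults. A minimal sketch of that task; the register name is inferred from the connection_route_rule6 variable used further down, since the playbook source itself is not part of this log:

- name: Get the IPv6 routing rule for the connection "{{ interface }}"
  ansible.builtin.command: nmcli -f ipv6.routing-rules c show "{{ interface }}"
  # register name inferred from the later assert; changed_when matches the
  # "changed": false reported in the result above
  register: connection_route_rule6
  changed_when: false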
29922 1726853672.81708: running TaskExecutor() for managed_node3/TASK: Assert that the routing rule with table lookup 30200 matches the specified rule 29922 1726853672.81783: in run() - task 02083763-bbaf-51d4-513b-000000000062 29922 1726853672.81795: variable 'ansible_search_path' from source: unknown 29922 1726853672.81823: calling self._execute() 29922 1726853672.81906: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853672.81912: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853672.81921: variable 'omit' from source: magic vars 29922 1726853672.82208: variable 'ansible_distribution_major_version' from source: facts 29922 1726853672.82217: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853672.82299: variable 'ansible_distribution_major_version' from source: facts 29922 1726853672.82303: Evaluated conditional (ansible_distribution_major_version != "7"): True 29922 1726853672.82309: variable 'omit' from source: magic vars 29922 1726853672.82325: variable 'omit' from source: magic vars 29922 1726853672.82352: variable 'omit' from source: magic vars 29922 1726853672.82388: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29922 1726853672.82418: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29922 1726853672.82433: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29922 1726853672.82446: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853672.82456: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853672.82483: variable 'inventory_hostname' from source: host vars for 'managed_node3' 29922 1726853672.82486: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853672.82488: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853672.82557: Set connection var ansible_connection to ssh 29922 1726853672.82566: Set connection var ansible_module_compression to ZIP_DEFLATED 29922 1726853672.82575: Set connection var ansible_shell_executable to /bin/sh 29922 1726853672.82581: Set connection var ansible_pipelining to False 29922 1726853672.82587: Set connection var ansible_timeout to 10 29922 1726853672.82589: Set connection var ansible_shell_type to sh 29922 1726853672.82606: variable 'ansible_shell_executable' from source: unknown 29922 1726853672.82609: variable 'ansible_connection' from source: unknown 29922 1726853672.82611: variable 'ansible_module_compression' from source: unknown 29922 1726853672.82614: variable 'ansible_shell_type' from source: unknown 29922 1726853672.82617: variable 'ansible_shell_executable' from source: unknown 29922 1726853672.82620: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853672.82622: variable 'ansible_pipelining' from source: unknown 29922 1726853672.82624: variable 'ansible_timeout' from source: unknown 29922 1726853672.82636: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853672.82746: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 29922 1726853672.82749: variable 'omit' from source: magic vars 29922 1726853672.82752: starting attempt loop 29922 1726853672.82754: running the handler 29922 1726853672.82866: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 29922 1726853672.83033: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 29922 1726853672.83069: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 29922 1726853672.83120: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 29922 1726853672.83145: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 29922 1726853672.83214: variable 'route_rule_table_30200' from source: set_fact 29922 1726853672.83236: Evaluated conditional (route_rule_table_30200.stdout is search("30200:(\s+)from 198.51.100.58/26 lookup 30200")): True 29922 1726853672.83330: variable 'route_rule_table_30200' from source: set_fact 29922 1726853672.83350: Evaluated conditional (route_rule_table_30200.stdout is search("30201:(\s+)from all fwmark 0x1/0x1 lookup 30200")): True 29922 1726853672.83442: variable 'route_rule_table_30200' from source: set_fact 29922 1726853672.83463: Evaluated conditional (route_rule_table_30200.stdout is search("30202:(\s+)from all ipproto tcp lookup 30200")): True 29922 1726853672.83552: variable 'route_rule_table_30200' from source: set_fact 29922 1726853672.83576: Evaluated conditional (route_rule_table_30200.stdout is search("30203:(\s+)from all sport 128-256 lookup 30200")): True 29922 1726853672.83664: variable 'route_rule_table_30200' from source: set_fact 29922 1726853672.83687: Evaluated conditional (route_rule_table_30200.stdout is search("30204:(\s+)from all tos (0x08|throughput) lookup 30200")): True 29922 1726853672.83692: handler run complete 29922 1726853672.83704: attempt loop complete, returning result 29922 1726853672.83707: _execute() done 29922 1726853672.83710: dumping result to json 29922 1726853672.83712: done dumping result, returning 29922 1726853672.83723: done running TaskExecutor() for managed_node3/TASK: Assert that the routing rule with table lookup 30200 matches the specified rule [02083763-bbaf-51d4-513b-000000000062] 29922 1726853672.83725: sending task result for task 02083763-bbaf-51d4-513b-000000000062 29922 1726853672.83803: done sending task result for task 02083763-bbaf-51d4-513b-000000000062 29922 1726853672.83806: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 29922 1726853672.83873: no more pending results, returning what we have 29922 1726853672.83877: results queue empty 29922 1726853672.83877: checking for any_errors_fatal 29922 1726853672.83888: done checking for any_errors_fatal 29922 1726853672.83889: checking for max_fail_percentage 29922 1726853672.83890: done checking for max_fail_percentage 29922 1726853672.83891: checking to see if all hosts have failed and the running result is not ok 29922 1726853672.83892: done checking to see if all hosts have failed 29922 1726853672.83893: getting the remaining hosts for this loop 29922 1726853672.83894: done getting the remaining hosts for this loop 29922 1726853672.83897: getting the next task for host 
managed_node3 29922 1726853672.83903: done getting next task for host managed_node3 29922 1726853672.83905: ^ task is: TASK: Assert that the routing rule with table lookup 30400 matches the specified rule 29922 1726853672.83907: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29922 1726853672.83911: getting variables 29922 1726853672.83912: in VariableManager get_vars() 29922 1726853672.83949: Calling all_inventory to load vars for managed_node3 29922 1726853672.83951: Calling groups_inventory to load vars for managed_node3 29922 1726853672.83953: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853672.83969: Calling all_plugins_play to load vars for managed_node3 29922 1726853672.83973: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853672.83976: Calling groups_plugins_play to load vars for managed_node3 29922 1726853672.84873: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853672.85736: done with get_vars() 29922 1726853672.85751: done getting variables 29922 1726853672.85796: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Assert that the routing rule with table lookup 30400 matches the specified rule] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:166 Friday 20 September 2024 13:34:32 -0400 (0:00:00.045) 0:00:21.787 ****** 29922 1726853672.85818: entering _queue_task() for managed_node3/assert 29922 1726853672.86073: worker is 1 (out of 1 available) 29922 1726853672.86088: exiting _queue_task() for managed_node3/assert 29922 1726853672.86102: done queuing things up, now waiting for results queue to drain 29922 1726853672.86103: waiting for pending results... 
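The five conditionals evaluated for table 30200 above come from the assert task at tests_routing_rules.yml:155. Reconstructed from the logged expressions (the regexes are verbatim from the log; the surrounding that: list is an assumed but conventional layout):

- name: Assert that the routing rule with table lookup 30200 matches the specified rule
  ansible.builtin.assert:
    that:
      - route_rule_table_30200.stdout is search("30200:(\s+)from 198.51.100.58/26 lookup 30200")
      - route_rule_table_30200.stdout is search("30201:(\s+)from all fwmark 0x1/0x1 lookup 30200")
      - route_rule_table_30200.stdout is search("30202:(\s+)from all ipproto tcp lookup 30200")
      - route_rule_table_30200.stdout is search("30203:(\s+)from all sport 128-256 lookup 30200")
      - route_rule_table_30200.stdout is search("30204:(\s+)from all tos (0x08|throughput) lookup 30200")

The (0x08|throughput) alternation presumably tolerates iproute2 builds that print the symbolic tos name instead of the numeric value.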
29922 1726853672.86289: running TaskExecutor() for managed_node3/TASK: Assert that the routing rule with table lookup 30400 matches the specified rule 29922 1726853672.86359: in run() - task 02083763-bbaf-51d4-513b-000000000063 29922 1726853672.86373: variable 'ansible_search_path' from source: unknown 29922 1726853672.86410: calling self._execute() 29922 1726853672.86493: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853672.86497: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853672.86507: variable 'omit' from source: magic vars 29922 1726853672.86796: variable 'ansible_distribution_major_version' from source: facts 29922 1726853672.86806: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853672.86885: variable 'ansible_distribution_major_version' from source: facts 29922 1726853672.86888: Evaluated conditional (ansible_distribution_major_version != "7"): True 29922 1726853672.86896: variable 'omit' from source: magic vars 29922 1726853672.86912: variable 'omit' from source: magic vars 29922 1726853672.86940: variable 'omit' from source: magic vars 29922 1726853672.86975: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29922 1726853672.87006: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29922 1726853672.87022: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29922 1726853672.87036: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853672.87046: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853672.87070: variable 'inventory_hostname' from source: host vars for 'managed_node3' 29922 1726853672.87075: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853672.87077: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853672.87146: Set connection var ansible_connection to ssh 29922 1726853672.87152: Set connection var ansible_module_compression to ZIP_DEFLATED 29922 1726853672.87160: Set connection var ansible_shell_executable to /bin/sh 29922 1726853672.87167: Set connection var ansible_pipelining to False 29922 1726853672.87173: Set connection var ansible_timeout to 10 29922 1726853672.87176: Set connection var ansible_shell_type to sh 29922 1726853672.87194: variable 'ansible_shell_executable' from source: unknown 29922 1726853672.87199: variable 'ansible_connection' from source: unknown 29922 1726853672.87202: variable 'ansible_module_compression' from source: unknown 29922 1726853672.87204: variable 'ansible_shell_type' from source: unknown 29922 1726853672.87206: variable 'ansible_shell_executable' from source: unknown 29922 1726853672.87208: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853672.87210: variable 'ansible_pipelining' from source: unknown 29922 1726853672.87212: variable 'ansible_timeout' from source: unknown 29922 1726853672.87214: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853672.87317: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 29922 1726853672.87331: variable 'omit' from source: magic vars 29922 1726853672.87334: starting attempt loop 29922 1726853672.87336: running the handler 29922 1726853672.87443: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 29922 1726853672.87615: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 29922 1726853672.87643: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 29922 1726853672.87704: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 29922 1726853672.87728: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 29922 1726853672.87796: variable 'route_rule_table_30400' from source: set_fact 29922 1726853672.87818: Evaluated conditional (route_rule_table_30400.stdout is search("30400:(\s+)from all to 198.51.100.128/26 lookup 30400")): True 29922 1726853672.87913: variable 'route_rule_table_30400' from source: set_fact 29922 1726853672.87932: Evaluated conditional (route_rule_table_30400.stdout is search("30401:(\s+)from all iif iiftest \[detached\] lookup 30400")): True 29922 1726853672.88022: variable 'route_rule_table_30400' from source: set_fact 29922 1726853672.88041: Evaluated conditional (route_rule_table_30400.stdout is search("30402:(\s+)from all oif oiftest \[detached\] lookup 30400")): True 29922 1726853672.88045: handler run complete 29922 1726853672.88059: attempt loop complete, returning result 29922 1726853672.88062: _execute() done 29922 1726853672.88065: dumping result to json 29922 1726853672.88067: done dumping result, returning 29922 1726853672.88073: done running TaskExecutor() for managed_node3/TASK: Assert that the routing rule with table lookup 30400 matches the specified rule [02083763-bbaf-51d4-513b-000000000063] 29922 1726853672.88076: sending task result for task 02083763-bbaf-51d4-513b-000000000063 29922 1726853672.88159: done sending task result for task 02083763-bbaf-51d4-513b-000000000063 29922 1726853672.88162: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 29922 1726853672.88239: no more pending results, returning what we have 29922 1726853672.88242: results queue empty 29922 1726853672.88243: checking for any_errors_fatal 29922 1726853672.88253: done checking for any_errors_fatal 29922 1726853672.88254: checking for max_fail_percentage 29922 1726853672.88258: done checking for max_fail_percentage 29922 1726853672.88259: checking to see if all hosts have failed and the running result is not ok 29922 1726853672.88260: done checking to see if all hosts have failed 29922 1726853672.88260: getting the remaining hosts for this loop 29922 1726853672.88262: done getting the remaining hosts for this loop 29922 1726853672.88265: getting the next task for host managed_node3 29922 1726853672.88270: done getting next task for host managed_node3 29922 1726853672.88274: ^ task is: TASK: Assert that the routing rule with table lookup 30600 matches the specified rule 29922 1726853672.88276: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29922 1726853672.88280: getting variables 29922 1726853672.88281: in VariableManager get_vars() 29922 1726853672.88317: Calling all_inventory to load vars for managed_node3 29922 1726853672.88319: Calling groups_inventory to load vars for managed_node3 29922 1726853672.88321: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853672.88331: Calling all_plugins_play to load vars for managed_node3 29922 1726853672.88333: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853672.88336: Calling groups_plugins_play to load vars for managed_node3 29922 1726853672.89157: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853672.90140: done with get_vars() 29922 1726853672.90158: done getting variables 29922 1726853672.90202: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Assert that the routing rule with table lookup 30600 matches the specified rule] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:175 Friday 20 September 2024 13:34:32 -0400 (0:00:00.044) 0:00:21.831 ****** 29922 1726853672.90224: entering _queue_task() for managed_node3/assert 29922 1726853672.90479: worker is 1 (out of 1 available) 29922 1726853672.90494: exiting _queue_task() for managed_node3/assert 29922 1726853672.90506: done queuing things up, now waiting for results queue to drain 29922 1726853672.90508: waiting for pending results... 
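The route_rule_table_* facts checked by these asserts hold kernel routing-rule output captured earlier in the play; the gathering step is not part of this excerpt. A plausible sketch of one such step, assuming the output of ip rule was registered per table (the task layout and the use of a plain command task are assumptions):

- name: Get the routing rule for table 30400
  ansible.builtin.command: ip rule list table 30400
  register: route_rule_table_30400
  changed_when: false

Analogous tasks would cover tables 30200 and custom, and an IPv6 variant (ip -6 rule) would feed route_rule_table_30600, whose expected entries reference 2001:db8::4/32.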
29922 1726853672.90687: running TaskExecutor() for managed_node3/TASK: Assert that the routing rule with table lookup 30600 matches the specified rule 29922 1726853672.90748: in run() - task 02083763-bbaf-51d4-513b-000000000064 29922 1726853672.90761: variable 'ansible_search_path' from source: unknown 29922 1726853672.90791: calling self._execute() 29922 1726853672.90874: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853672.90879: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853672.90888: variable 'omit' from source: magic vars 29922 1726853672.91178: variable 'ansible_distribution_major_version' from source: facts 29922 1726853672.91188: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853672.91262: variable 'ansible_distribution_major_version' from source: facts 29922 1726853672.91265: Evaluated conditional (ansible_distribution_major_version != "7"): True 29922 1726853672.91274: variable 'omit' from source: magic vars 29922 1726853672.91293: variable 'omit' from source: magic vars 29922 1726853672.91323: variable 'omit' from source: magic vars 29922 1726853672.91357: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29922 1726853672.91385: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29922 1726853672.91401: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29922 1726853672.91415: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853672.91424: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853672.91448: variable 'inventory_hostname' from source: host vars for 'managed_node3' 29922 1726853672.91452: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853672.91454: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853672.91523: Set connection var ansible_connection to ssh 29922 1726853672.91530: Set connection var ansible_module_compression to ZIP_DEFLATED 29922 1726853672.91537: Set connection var ansible_shell_executable to /bin/sh 29922 1726853672.91543: Set connection var ansible_pipelining to False 29922 1726853672.91548: Set connection var ansible_timeout to 10 29922 1726853672.91551: Set connection var ansible_shell_type to sh 29922 1726853672.91569: variable 'ansible_shell_executable' from source: unknown 29922 1726853672.91574: variable 'ansible_connection' from source: unknown 29922 1726853672.91577: variable 'ansible_module_compression' from source: unknown 29922 1726853672.91579: variable 'ansible_shell_type' from source: unknown 29922 1726853672.91581: variable 'ansible_shell_executable' from source: unknown 29922 1726853672.91583: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853672.91588: variable 'ansible_pipelining' from source: unknown 29922 1726853672.91590: variable 'ansible_timeout' from source: unknown 29922 1726853672.91594: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853672.91695: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 29922 1726853672.91705: variable 'omit' from source: magic vars 29922 1726853672.91712: starting attempt loop 29922 1726853672.91715: running the handler 29922 1726853672.91822: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 29922 1726853672.91994: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 29922 1726853672.92025: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 29922 1726853672.92085: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 29922 1726853672.92110: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 29922 1726853672.92175: variable 'route_rule_table_30600' from source: set_fact 29922 1726853672.92198: Evaluated conditional (route_rule_table_30600.stdout is search("30600:(\s+)from all to 2001:db8::4/32 lookup 30600")): True 29922 1726853672.92291: variable 'route_rule_table_30600' from source: set_fact 29922 1726853672.92310: Evaluated conditional (route_rule_table_30600.stdout is search("30601:(\s+)not from all dport 128-256 lookup 30600")): True 29922 1726853672.92316: handler run complete 29922 1726853672.92326: attempt loop complete, returning result 29922 1726853672.92329: _execute() done 29922 1726853672.92331: dumping result to json 29922 1726853672.92334: done dumping result, returning 29922 1726853672.92341: done running TaskExecutor() for managed_node3/TASK: Assert that the routing rule with table lookup 30600 matches the specified rule [02083763-bbaf-51d4-513b-000000000064] 29922 1726853672.92343: sending task result for task 02083763-bbaf-51d4-513b-000000000064 29922 1726853672.92424: done sending task result for task 02083763-bbaf-51d4-513b-000000000064 29922 1726853672.92427: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 29922 1726853672.92476: no more pending results, returning what we have 29922 1726853672.92480: results queue empty 29922 1726853672.92481: checking for any_errors_fatal 29922 1726853672.92488: done checking for any_errors_fatal 29922 1726853672.92488: checking for max_fail_percentage 29922 1726853672.92490: done checking for max_fail_percentage 29922 1726853672.92491: checking to see if all hosts have failed and the running result is not ok 29922 1726853672.92491: done checking to see if all hosts have failed 29922 1726853672.92492: getting the remaining hosts for this loop 29922 1726853672.92493: done getting the remaining hosts for this loop 29922 1726853672.92496: getting the next task for host managed_node3 29922 1726853672.92502: done getting next task for host managed_node3 29922 1726853672.92504: ^ task is: TASK: Assert that the routing rule with 'custom' table lookup matches the specified rule 29922 1726853672.92506: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853672.92510: getting variables 29922 1726853672.92511: in VariableManager get_vars() 29922 1726853672.92550: Calling all_inventory to load vars for managed_node3 29922 1726853672.92552: Calling groups_inventory to load vars for managed_node3 29922 1726853672.92556: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853672.92567: Calling all_plugins_play to load vars for managed_node3 29922 1726853672.92569: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853672.92573: Calling groups_plugins_play to load vars for managed_node3 29922 1726853672.93388: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853672.94259: done with get_vars() 29922 1726853672.94277: done getting variables 29922 1726853672.94322: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Assert that the routing rule with 'custom' table lookup matches the specified rule] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:183 Friday 20 September 2024 13:34:32 -0400 (0:00:00.041) 0:00:21.873 ****** 29922 1726853672.94346: entering _queue_task() for managed_node3/assert 29922 1726853672.94595: worker is 1 (out of 1 available) 29922 1726853672.94609: exiting _queue_task() for managed_node3/assert 29922 1726853672.94620: done queuing things up, now waiting for results queue to drain 29922 1726853672.94622: waiting for pending results... 
29922 1726853672.94804: running TaskExecutor() for managed_node3/TASK: Assert that the routing rule with 'custom' table lookup matches the specified rule 29922 1726853672.94870: in run() - task 02083763-bbaf-51d4-513b-000000000065 29922 1726853672.94886: variable 'ansible_search_path' from source: unknown 29922 1726853672.94915: calling self._execute() 29922 1726853672.94997: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853672.95003: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853672.95013: variable 'omit' from source: magic vars 29922 1726853672.95292: variable 'ansible_distribution_major_version' from source: facts 29922 1726853672.95303: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853672.95380: variable 'ansible_distribution_major_version' from source: facts 29922 1726853672.95383: Evaluated conditional (ansible_distribution_major_version != "7"): True 29922 1726853672.95395: variable 'omit' from source: magic vars 29922 1726853672.95409: variable 'omit' from source: magic vars 29922 1726853672.95436: variable 'omit' from source: magic vars 29922 1726853672.95469: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29922 1726853672.95498: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29922 1726853672.95516: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29922 1726853672.95528: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853672.95537: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853672.95562: variable 'inventory_hostname' from source: host vars for 'managed_node3' 29922 1726853672.95565: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853672.95567: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853672.95636: Set connection var ansible_connection to ssh 29922 1726853672.95643: Set connection var ansible_module_compression to ZIP_DEFLATED 29922 1726853672.95650: Set connection var ansible_shell_executable to /bin/sh 29922 1726853672.95659: Set connection var ansible_pipelining to False 29922 1726853672.95662: Set connection var ansible_timeout to 10 29922 1726853672.95664: Set connection var ansible_shell_type to sh 29922 1726853672.95684: variable 'ansible_shell_executable' from source: unknown 29922 1726853672.95687: variable 'ansible_connection' from source: unknown 29922 1726853672.95689: variable 'ansible_module_compression' from source: unknown 29922 1726853672.95692: variable 'ansible_shell_type' from source: unknown 29922 1726853672.95694: variable 'ansible_shell_executable' from source: unknown 29922 1726853672.95696: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853672.95700: variable 'ansible_pipelining' from source: unknown 29922 1726853672.95702: variable 'ansible_timeout' from source: unknown 29922 1726853672.95706: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853672.95807: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 29922 1726853672.95816: variable 'omit' from source: magic vars 29922 1726853672.95822: starting attempt loop 29922 1726853672.95827: running the handler 29922 1726853672.95934: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 29922 1726853672.96100: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 29922 1726853672.96132: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 29922 1726853672.96188: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 29922 1726853672.96212: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 29922 1726853672.96276: variable 'route_rule_table_custom' from source: set_fact 29922 1726853672.96298: Evaluated conditional (route_rule_table_custom.stdout is search("200:(\s+)from 198.51.100.56/26 lookup custom")): True 29922 1726853672.96301: handler run complete 29922 1726853672.96313: attempt loop complete, returning result 29922 1726853672.96316: _execute() done 29922 1726853672.96318: dumping result to json 29922 1726853672.96321: done dumping result, returning 29922 1726853672.96327: done running TaskExecutor() for managed_node3/TASK: Assert that the routing rule with 'custom' table lookup matches the specified rule [02083763-bbaf-51d4-513b-000000000065] 29922 1726853672.96332: sending task result for task 02083763-bbaf-51d4-513b-000000000065 29922 1726853672.96410: done sending task result for task 02083763-bbaf-51d4-513b-000000000065 29922 1726853672.96413: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 29922 1726853672.96464: no more pending results, returning what we have 29922 1726853672.96468: results queue empty 29922 1726853672.96469: checking for any_errors_fatal 29922 1726853672.96481: done checking for any_errors_fatal 29922 1726853672.96481: checking for max_fail_percentage 29922 1726853672.96483: done checking for max_fail_percentage 29922 1726853672.96484: checking to see if all hosts have failed and the running result is not ok 29922 1726853672.96485: done checking to see if all hosts have failed 29922 1726853672.96485: getting the remaining hosts for this loop 29922 1726853672.96486: done getting the remaining hosts for this loop 29922 1726853672.96490: getting the next task for host managed_node3 29922 1726853672.96496: done getting next task for host managed_node3 29922 1726853672.96499: ^ task is: TASK: Assert that the specified IPv4 routing rule was configured in the connection "{{ interface }}" 29922 1726853672.96501: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853672.96505: getting variables 29922 1726853672.96506: in VariableManager get_vars() 29922 1726853672.96549: Calling all_inventory to load vars for managed_node3 29922 1726853672.96551: Calling groups_inventory to load vars for managed_node3 29922 1726853672.96553: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853672.96566: Calling all_plugins_play to load vars for managed_node3 29922 1726853672.96569: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853672.96573: Calling groups_plugins_play to load vars for managed_node3 29922 1726853672.97539: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853672.98399: done with get_vars() 29922 1726853672.98417: done getting variables 29922 1726853672.98462: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 29922 1726853672.98550: variable 'interface' from source: set_fact TASK [Assert that the specified IPv4 routing rule was configured in the connection "ethtest0"] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:190 Friday 20 September 2024 13:34:32 -0400 (0:00:00.042) 0:00:21.915 ****** 29922 1726853672.98581: entering _queue_task() for managed_node3/assert 29922 1726853672.98840: worker is 1 (out of 1 available) 29922 1726853672.98858: exiting _queue_task() for managed_node3/assert 29922 1726853672.98873: done queuing things up, now waiting for results queue to drain 29922 1726853672.98874: waiting for pending results... 
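The 'custom' assert above matches the kernel output "lookup custom", while the connection-level IPv4 assert that follows expects "table 200" for the same 198.51.100.56/26 source, so the play evidently maps table id 200 to the name custom through a file under /etc/iproute2/rt_tables.d/ (the "dedicated test file" that the last task in this excerpt removes). A sketch of such a setup/teardown pair; the file name and exact content are assumptions, only the directory and the 200-to-custom mapping are implied by the logged assertions:

- name: Create the dedicated test file in `/etc/iproute2/rt_tables.d/`
  ansible.builtin.copy:
    dest: /etc/iproute2/rt_tables.d/table.conf   # hypothetical file name
    content: "200 custom\n"
    mode: "0644"

- name: Remove the dedicated test file in `/etc/iproute2/rt_tables.d/`
  ansible.builtin.file:
    path: /etc/iproute2/rt_tables.d/table.conf   # hypothetical file name
    state: absent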
29922 1726853672.99051: running TaskExecutor() for managed_node3/TASK: Assert that the specified IPv4 routing rule was configured in the connection "ethtest0" 29922 1726853672.99133: in run() - task 02083763-bbaf-51d4-513b-000000000066 29922 1726853672.99145: variable 'ansible_search_path' from source: unknown 29922 1726853672.99179: calling self._execute() 29922 1726853672.99263: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853672.99267: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853672.99281: variable 'omit' from source: magic vars 29922 1726853672.99567: variable 'ansible_distribution_major_version' from source: facts 29922 1726853672.99579: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853672.99585: variable 'omit' from source: magic vars 29922 1726853672.99602: variable 'omit' from source: magic vars 29922 1726853672.99676: variable 'interface' from source: set_fact 29922 1726853672.99691: variable 'omit' from source: magic vars 29922 1726853672.99722: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29922 1726853672.99751: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29922 1726853672.99773: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29922 1726853672.99787: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853672.99796: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853672.99820: variable 'inventory_hostname' from source: host vars for 'managed_node3' 29922 1726853672.99823: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853672.99825: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853672.99898: Set connection var ansible_connection to ssh 29922 1726853672.99905: Set connection var ansible_module_compression to ZIP_DEFLATED 29922 1726853672.99912: Set connection var ansible_shell_executable to /bin/sh 29922 1726853672.99919: Set connection var ansible_pipelining to False 29922 1726853672.99925: Set connection var ansible_timeout to 10 29922 1726853672.99928: Set connection var ansible_shell_type to sh 29922 1726853672.99944: variable 'ansible_shell_executable' from source: unknown 29922 1726853672.99948: variable 'ansible_connection' from source: unknown 29922 1726853672.99951: variable 'ansible_module_compression' from source: unknown 29922 1726853672.99953: variable 'ansible_shell_type' from source: unknown 29922 1726853672.99956: variable 'ansible_shell_executable' from source: unknown 29922 1726853672.99960: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853672.99965: variable 'ansible_pipelining' from source: unknown 29922 1726853672.99967: variable 'ansible_timeout' from source: unknown 29922 1726853672.99972: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853673.00073: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 29922 
1726853673.00085: variable 'omit' from source: magic vars 29922 1726853673.00089: starting attempt loop 29922 1726853673.00092: running the handler 29922 1726853673.00199: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 29922 1726853673.00373: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 29922 1726853673.00402: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 29922 1726853673.00454: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 29922 1726853673.00483: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 29922 1726853673.00545: variable 'connection_route_rule' from source: set_fact 29922 1726853673.00568: Evaluated conditional (connection_route_rule.stdout is search("priority 30200 from 198.51.100.58/26 table 30200")): True 29922 1726853673.00664: variable 'connection_route_rule' from source: set_fact 29922 1726853673.00684: Evaluated conditional (connection_route_rule.stdout is search("priority 30201 from 0.0.0.0/0 fwmark 0x1/0x1 table 30200")): True 29922 1726853673.00773: variable 'connection_route_rule' from source: set_fact 29922 1726853673.00789: Evaluated conditional (connection_route_rule.stdout is search("priority 30202 from 0.0.0.0/0 ipproto 6 table 30200")): True 29922 1726853673.00877: variable 'connection_route_rule' from source: set_fact 29922 1726853673.00893: Evaluated conditional (connection_route_rule.stdout is search("priority 30203 from 0.0.0.0/0 sport 128-256 table 30200")): True 29922 1726853673.00980: variable 'connection_route_rule' from source: set_fact 29922 1726853673.00996: Evaluated conditional (connection_route_rule.stdout is search("priority 30204 from 0.0.0.0/0 tos 0x08 table 30200")): True 29922 1726853673.01082: variable 'connection_route_rule' from source: set_fact 29922 1726853673.01097: Evaluated conditional (connection_route_rule.stdout is search("priority 30400 to 198.51.100.128/26 table 30400")): True 29922 1726853673.01186: variable 'connection_route_rule' from source: set_fact 29922 1726853673.01200: Evaluated conditional (connection_route_rule.stdout is search("priority 30401 from 0.0.0.0/0 iif iiftest table 30400")): True 29922 1726853673.01285: variable 'connection_route_rule' from source: set_fact 29922 1726853673.01304: Evaluated conditional (connection_route_rule.stdout is search("priority 30402 from 0.0.0.0/0 oif oiftest table 30400")): True 29922 1726853673.01390: variable 'connection_route_rule' from source: set_fact 29922 1726853673.01408: Evaluated conditional (connection_route_rule.stdout is search("priority 30403 from 0.0.0.0/0 table 30400")): True 29922 1726853673.01492: variable 'connection_route_rule' from source: set_fact 29922 1726853673.01509: Evaluated conditional (connection_route_rule.stdout is search("priority 200 from 198.51.100.56/26 table 200")): True 29922 1726853673.01514: handler run complete 29922 1726853673.01525: attempt loop complete, returning result 29922 1726853673.01528: _execute() done 29922 1726853673.01530: dumping result to json 29922 1726853673.01533: done dumping result, returning 29922 1726853673.01540: done running TaskExecutor() for managed_node3/TASK: Assert that the specified IPv4 routing rule was configured in the connection "ethtest0" [02083763-bbaf-51d4-513b-000000000066] 29922 1726853673.01542: sending task result for task 
02083763-bbaf-51d4-513b-000000000066 29922 1726853673.01627: done sending task result for task 02083763-bbaf-51d4-513b-000000000066 29922 1726853673.01629: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 29922 1726853673.01679: no more pending results, returning what we have 29922 1726853673.01682: results queue empty 29922 1726853673.01683: checking for any_errors_fatal 29922 1726853673.01691: done checking for any_errors_fatal 29922 1726853673.01692: checking for max_fail_percentage 29922 1726853673.01693: done checking for max_fail_percentage 29922 1726853673.01694: checking to see if all hosts have failed and the running result is not ok 29922 1726853673.01695: done checking to see if all hosts have failed 29922 1726853673.01701: getting the remaining hosts for this loop 29922 1726853673.01703: done getting the remaining hosts for this loop 29922 1726853673.01706: getting the next task for host managed_node3 29922 1726853673.01711: done getting next task for host managed_node3 29922 1726853673.01713: ^ task is: TASK: Assert that the specified IPv6 routing rule was configured in the connection "{{ interface }}" 29922 1726853673.01715: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29922 1726853673.01718: getting variables 29922 1726853673.01720: in VariableManager get_vars() 29922 1726853673.01757: Calling all_inventory to load vars for managed_node3 29922 1726853673.01759: Calling groups_inventory to load vars for managed_node3 29922 1726853673.01762: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853673.01774: Calling all_plugins_play to load vars for managed_node3 29922 1726853673.01777: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853673.01780: Calling groups_plugins_play to load vars for managed_node3 29922 1726853673.02577: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853673.03436: done with get_vars() 29922 1726853673.03452: done getting variables 29922 1726853673.03495: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 29922 1726853673.03578: variable 'interface' from source: set_fact TASK [Assert that the specified IPv6 routing rule was configured in the connection "ethtest0"] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:205 Friday 20 September 2024 13:34:33 -0400 (0:00:00.050) 0:00:21.965 ****** 29922 1726853673.03599: entering _queue_task() for managed_node3/assert 29922 1726853673.03834: worker is 1 (out of 1 available) 29922 1726853673.03849: exiting _queue_task() for managed_node3/assert 29922 1726853673.03860: done queuing things up, now waiting for results queue to drain 29922 1726853673.03862: waiting for pending results... 
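The IPv6 connection-level assert that runs next accepts two orderings for the negated rule, presumably to tolerate differences in how NetworkManager serializes the "not" relative to the priority. Reconstructed from the logged conditionals (expressions verbatim from the log, the that: wrapper assumed):

- name: Assert that the specified IPv6 routing rule was configured in the connection "{{ interface }}"
  ansible.builtin.assert:
    that:
      - connection_route_rule6.stdout is search("priority 30600 to 2001:db8::4/32 table 30600")
      - connection_route_rule6.stdout is search("priority 30601 not from ::/0 dport 128-256 table 30600")
        or connection_route_rule6.stdout is search("not priority 30601 from ::/0 dport 128-256 table 30600")
      - connection_route_rule6.stdout is search("priority 30602 from ::/0 table 30600")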
29922 1726853673.04044: running TaskExecutor() for managed_node3/TASK: Assert that the specified IPv6 routing rule was configured in the connection "ethtest0" 29922 1726853673.04114: in run() - task 02083763-bbaf-51d4-513b-000000000067 29922 1726853673.04126: variable 'ansible_search_path' from source: unknown 29922 1726853673.04155: calling self._execute() 29922 1726853673.04239: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853673.04243: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853673.04253: variable 'omit' from source: magic vars 29922 1726853673.04528: variable 'ansible_distribution_major_version' from source: facts 29922 1726853673.04537: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853673.04543: variable 'omit' from source: magic vars 29922 1726853673.04563: variable 'omit' from source: magic vars 29922 1726853673.04634: variable 'interface' from source: set_fact 29922 1726853673.04649: variable 'omit' from source: magic vars 29922 1726853673.04685: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29922 1726853673.04711: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29922 1726853673.04728: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29922 1726853673.04741: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853673.04751: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853673.04779: variable 'inventory_hostname' from source: host vars for 'managed_node3' 29922 1726853673.04783: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853673.04785: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853673.04850: Set connection var ansible_connection to ssh 29922 1726853673.04858: Set connection var ansible_module_compression to ZIP_DEFLATED 29922 1726853673.04869: Set connection var ansible_shell_executable to /bin/sh 29922 1726853673.04876: Set connection var ansible_pipelining to False 29922 1726853673.04881: Set connection var ansible_timeout to 10 29922 1726853673.04884: Set connection var ansible_shell_type to sh 29922 1726853673.04902: variable 'ansible_shell_executable' from source: unknown 29922 1726853673.04905: variable 'ansible_connection' from source: unknown 29922 1726853673.04908: variable 'ansible_module_compression' from source: unknown 29922 1726853673.04910: variable 'ansible_shell_type' from source: unknown 29922 1726853673.04912: variable 'ansible_shell_executable' from source: unknown 29922 1726853673.04914: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853673.04919: variable 'ansible_pipelining' from source: unknown 29922 1726853673.04922: variable 'ansible_timeout' from source: unknown 29922 1726853673.04926: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853673.05027: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 29922 
1726853673.05037: variable 'omit' from source: magic vars 29922 1726853673.05043: starting attempt loop 29922 1726853673.05046: running the handler 29922 1726853673.05154: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 29922 1726853673.05323: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 29922 1726853673.05353: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 29922 1726853673.05411: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 29922 1726853673.05438: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 29922 1726853673.05498: variable 'connection_route_rule6' from source: set_fact 29922 1726853673.05520: Evaluated conditional (connection_route_rule6.stdout is search("priority 30600 to 2001:db8::4/32 table 30600")): True 29922 1726853673.05641: variable 'connection_route_rule6' from source: set_fact 29922 1726853673.05662: Evaluated conditional (connection_route_rule6.stdout is search("priority 30601 not from ::/0 dport 128-256 table 30600") or connection_route_rule6.stdout is search("not priority 30601 from ::/0 dport 128-256 table 30600")): True 29922 1726853673.05748: variable 'connection_route_rule6' from source: set_fact 29922 1726853673.05766: Evaluated conditional (connection_route_rule6.stdout is search("priority 30602 from ::/0 table 30600")): True 29922 1726853673.05772: handler run complete 29922 1726853673.05783: attempt loop complete, returning result 29922 1726853673.05786: _execute() done 29922 1726853673.05789: dumping result to json 29922 1726853673.05791: done dumping result, returning 29922 1726853673.05798: done running TaskExecutor() for managed_node3/TASK: Assert that the specified IPv6 routing rule was configured in the connection "ethtest0" [02083763-bbaf-51d4-513b-000000000067] 29922 1726853673.05800: sending task result for task 02083763-bbaf-51d4-513b-000000000067 29922 1726853673.05881: done sending task result for task 02083763-bbaf-51d4-513b-000000000067 29922 1726853673.05884: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 29922 1726853673.05928: no more pending results, returning what we have 29922 1726853673.05931: results queue empty 29922 1726853673.05932: checking for any_errors_fatal 29922 1726853673.05945: done checking for any_errors_fatal 29922 1726853673.05945: checking for max_fail_percentage 29922 1726853673.05947: done checking for max_fail_percentage 29922 1726853673.05948: checking to see if all hosts have failed and the running result is not ok 29922 1726853673.05949: done checking to see if all hosts have failed 29922 1726853673.05949: getting the remaining hosts for this loop 29922 1726853673.05950: done getting the remaining hosts for this loop 29922 1726853673.05953: getting the next task for host managed_node3 29922 1726853673.05959: done getting next task for host managed_node3 29922 1726853673.05961: ^ task is: TASK: Remove the dedicated test file in `/etc/iproute2/rt_tables.d/` 29922 1726853673.05963: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853673.05968: getting variables 29922 1726853673.05969: in VariableManager get_vars() 29922 1726853673.06011: Calling all_inventory to load vars for managed_node3 29922 1726853673.06014: Calling groups_inventory to load vars for managed_node3 29922 1726853673.06016: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853673.06026: Calling all_plugins_play to load vars for managed_node3 29922 1726853673.06028: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853673.06031: Calling groups_plugins_play to load vars for managed_node3 29922 1726853673.06928: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853673.07834: done with get_vars() 29922 1726853673.07848: done getting variables TASK [Remove the dedicated test file in `/etc/iproute2/rt_tables.d/`] ********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:213 Friday 20 September 2024 13:34:33 -0400 (0:00:00.043) 0:00:22.008 ****** 29922 1726853673.07915: entering _queue_task() for managed_node3/file 29922 1726853673.08140: worker is 1 (out of 1 available) 29922 1726853673.08154: exiting _queue_task() for managed_node3/file 29922 1726853673.08166: done queuing things up, now waiting for results queue to drain 29922 1726853673.08167: waiting for pending results... 29922 1726853673.08344: running TaskExecutor() for managed_node3/TASK: Remove the dedicated test file in `/etc/iproute2/rt_tables.d/` 29922 1726853673.08403: in run() - task 02083763-bbaf-51d4-513b-000000000068 29922 1726853673.08414: variable 'ansible_search_path' from source: unknown 29922 1726853673.08443: calling self._execute() 29922 1726853673.08529: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853673.08533: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853673.08542: variable 'omit' from source: magic vars 29922 1726853673.08820: variable 'ansible_distribution_major_version' from source: facts 29922 1726853673.08830: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853673.08838: variable 'omit' from source: magic vars 29922 1726853673.08855: variable 'omit' from source: magic vars 29922 1726853673.08885: variable 'omit' from source: magic vars 29922 1726853673.08917: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29922 1726853673.08946: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29922 1726853673.08965: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29922 1726853673.08980: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853673.08989: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853673.09011: variable 'inventory_hostname' from source: host vars for 'managed_node3' 29922 1726853673.09014: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853673.09017: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853673.09143: Set connection var ansible_connection to ssh 29922 1726853673.09147: Set connection var ansible_module_compression to ZIP_DEFLATED 29922 
1726853673.09152: Set connection var ansible_shell_executable to /bin/sh 29922 1726853673.09154: Set connection var ansible_pipelining to False 29922 1726853673.09158: Set connection var ansible_timeout to 10 29922 1726853673.09160: Set connection var ansible_shell_type to sh 29922 1726853673.09183: variable 'ansible_shell_executable' from source: unknown 29922 1726853673.09185: variable 'ansible_connection' from source: unknown 29922 1726853673.09188: variable 'ansible_module_compression' from source: unknown 29922 1726853673.09191: variable 'ansible_shell_type' from source: unknown 29922 1726853673.09193: variable 'ansible_shell_executable' from source: unknown 29922 1726853673.09196: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853673.09198: variable 'ansible_pipelining' from source: unknown 29922 1726853673.09200: variable 'ansible_timeout' from source: unknown 29922 1726853673.09202: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853673.09403: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 29922 1726853673.09408: variable 'omit' from source: magic vars 29922 1726853673.09411: starting attempt loop 29922 1726853673.09414: running the handler 29922 1726853673.09576: _low_level_execute_command(): starting 29922 1726853673.09579: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 29922 1726853673.10173: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853673.10191: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853673.10209: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853673.10308: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853673.12007: stdout chunk (state=3): >>>/root <<< 29922 1726853673.12106: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853673.12133: stderr chunk (state=3): >>><<< 29922 1726853673.12136: stdout chunk (state=3): >>><<< 29922 1726853673.12158: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853673.12202: _low_level_execute_command(): starting 29922 1726853673.12205: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853673.1215818-30991-81093764798909 `" && echo ansible-tmp-1726853673.1215818-30991-81093764798909="` echo /root/.ansible/tmp/ansible-tmp-1726853673.1215818-30991-81093764798909 `" ) && sleep 0' 29922 1726853673.12860: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853673.12863: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853673.12865: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853673.12867: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853673.12869: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 29922 1726853673.12876: stderr chunk (state=3): >>>debug2: match not found <<< 29922 1726853673.12878: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853673.12889: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853673.13078: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853673.13081: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853673.13083: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853673.13160: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853673.15140: stdout chunk (state=3): >>>ansible-tmp-1726853673.1215818-30991-81093764798909=/root/.ansible/tmp/ansible-tmp-1726853673.1215818-30991-81093764798909 <<< 29922 1726853673.15245: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853673.15297: stderr chunk (state=3): >>><<< 29922 1726853673.15307: stdout chunk (state=3): >>><<< 29922 
1726853673.15333: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853673.1215818-30991-81093764798909=/root/.ansible/tmp/ansible-tmp-1726853673.1215818-30991-81093764798909 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853673.15399: variable 'ansible_module_compression' from source: unknown 29922 1726853673.15474: ANSIBALLZ: Using lock for file 29922 1726853673.15483: ANSIBALLZ: Acquiring lock 29922 1726853673.15491: ANSIBALLZ: Lock acquired: 140376041362672 29922 1726853673.15498: ANSIBALLZ: Creating module 29922 1726853673.29113: ANSIBALLZ: Writing module into payload 29922 1726853673.29477: ANSIBALLZ: Writing module 29922 1726853673.29481: ANSIBALLZ: Renaming module 29922 1726853673.29483: ANSIBALLZ: Done creating module 29922 1726853673.29486: variable 'ansible_facts' from source: unknown 29922 1726853673.29488: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853673.1215818-30991-81093764798909/AnsiballZ_file.py 29922 1726853673.29626: Sending initial data 29922 1726853673.29629: Sent initial data (152 bytes) 29922 1726853673.30218: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853673.30268: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853673.30286: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853673.30298: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 29922 1726853673.30377: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853673.30393: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853673.30418: stderr 
chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853673.30430: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853673.30529: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853673.32226: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 29922 1726853673.32288: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 29922 1726853673.32354: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-29922avbm1zsd/tmpxd9yxso7 /root/.ansible/tmp/ansible-tmp-1726853673.1215818-30991-81093764798909/AnsiballZ_file.py <<< 29922 1726853673.32359: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853673.1215818-30991-81093764798909/AnsiballZ_file.py" <<< 29922 1726853673.32425: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-29922avbm1zsd/tmpxd9yxso7" to remote "/root/.ansible/tmp/ansible-tmp-1726853673.1215818-30991-81093764798909/AnsiballZ_file.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853673.1215818-30991-81093764798909/AnsiballZ_file.py" <<< 29922 1726853673.33295: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853673.33326: stderr chunk (state=3): >>><<< 29922 1726853673.33389: stdout chunk (state=3): >>><<< 29922 1726853673.33399: done transferring module to remote 29922 1726853673.33415: _low_level_execute_command(): starting 29922 1726853673.33424: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853673.1215818-30991-81093764798909/ /root/.ansible/tmp/ansible-tmp-1726853673.1215818-30991-81093764798909/AnsiballZ_file.py && sleep 0' 29922 1726853673.34167: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853673.34193: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853673.34288: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853673.34324: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853673.34345: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853673.34362: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853673.34454: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853673.36340: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853673.36360: stdout chunk (state=3): >>><<< 29922 1726853673.36381: stderr chunk (state=3): >>><<< 29922 1726853673.36404: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853673.36478: _low_level_execute_command(): starting 29922 1726853673.36481: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853673.1215818-30991-81093764798909/AnsiballZ_file.py && sleep 0' 29922 1726853673.37093: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853673.37108: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853673.37189: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853673.37193: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853673.37248: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853673.37280: stderr 
chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853673.37302: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853673.37402: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853673.53837: stdout chunk (state=3): >>> {"path": "/etc/iproute2/rt_tables.d/table.conf", "changed": true, "diff": {"before": {"path": "/etc/iproute2/rt_tables.d/table.conf", "state": "file"}, "after": {"path": "/etc/iproute2/rt_tables.d/table.conf", "state": "absent"}}, "state": "absent", "invocation": {"module_args": {"state": "absent", "path": "/etc/iproute2/rt_tables.d/table.conf", "recurse": false, "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_original_basename": null, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "mode": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 29922 1726853673.55459: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 29922 1726853673.55463: stdout chunk (state=3): >>><<< 29922 1726853673.55465: stderr chunk (state=3): >>><<< 29922 1726853673.55494: _low_level_execute_command() done: rc=0, stdout= {"path": "/etc/iproute2/rt_tables.d/table.conf", "changed": true, "diff": {"before": {"path": "/etc/iproute2/rt_tables.d/table.conf", "state": "file"}, "after": {"path": "/etc/iproute2/rt_tables.d/table.conf", "state": "absent"}}, "state": "absent", "invocation": {"module_args": {"state": "absent", "path": "/etc/iproute2/rt_tables.d/table.conf", "recurse": false, "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_original_basename": null, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "mode": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
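For readers following the trace: the assertion evaluated at the top of this excerpt and the file removal executed just above correspond to two ordinary tasks in tests_routing_rules.yml. The sketch below is a reconstruction from the conditionals and module_args printed in the log, not the playbook source itself; the module spellings (ansible.builtin.assert / ansible.builtin.file) and the YAML layout are assumptions.

# Reconstructed sketch (assumed layout; the conditionals and module args are
# taken verbatim from the log above, everything else is illustrative).
- name: Assert that the specified IPv6 routing rule was configured in the connection "ethtest0"
  ansible.builtin.assert:   # assumed module; the log only shows the evaluated conditionals
    that:
      - connection_route_rule6.stdout is search("priority 30600 to 2001:db8::4/32 table 30600")
      - connection_route_rule6.stdout is search("priority 30601 not from ::/0 dport 128-256 table 30600") or connection_route_rule6.stdout is search("not priority 30601 from ::/0 dport 128-256 table 30600")
      - connection_route_rule6.stdout is search("priority 30602 from ::/0 table 30600")

- name: Remove the dedicated test file in `/etc/iproute2/rt_tables.d/`
  ansible.builtin.file:     # module_args shown in the stdout chunk above
    path: /etc/iproute2/rt_tables.d/table.conf
    state: absent

Everything in between in the log (the `echo ~` probe, the mkdir of the ansible-tmp directory, the SFTP put of AnsiballZ_file.py, the chmod, the python3.12 execution, and the final `rm -f -r` of the temp directory) is the standard remote execution path for a single module over the persistent multiplexed SSH connection; none of it is written in the playbook itself.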
29922 1726853673.55878: done with _execute_module (file, {'state': 'absent', 'path': '/etc/iproute2/rt_tables.d/table.conf', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'file', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853673.1215818-30991-81093764798909/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 29922 1726853673.55882: _low_level_execute_command(): starting 29922 1726853673.55884: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853673.1215818-30991-81093764798909/ > /dev/null 2>&1 && sleep 0' 29922 1726853673.56808: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853673.56812: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853673.57088: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853673.57135: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853673.57291: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853673.59145: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853673.59178: stderr chunk (state=3): >>><<< 29922 1726853673.59188: stdout chunk (state=3): >>><<< 29922 1726853673.59209: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853673.59221: handler run complete 29922 1726853673.59247: attempt loop complete, returning result 29922 1726853673.59254: _execute() done 29922 1726853673.59259: dumping result to json 29922 1726853673.59268: done dumping result, returning 29922 1726853673.59284: done running TaskExecutor() for managed_node3/TASK: Remove the dedicated test file in `/etc/iproute2/rt_tables.d/` [02083763-bbaf-51d4-513b-000000000068] 29922 1726853673.59291: sending task result for task 02083763-bbaf-51d4-513b-000000000068 changed: [managed_node3] => { "changed": true, "path": "/etc/iproute2/rt_tables.d/table.conf", "state": "absent" } 29922 1726853673.59468: no more pending results, returning what we have 29922 1726853673.59474: results queue empty 29922 1726853673.59475: checking for any_errors_fatal 29922 1726853673.59485: done checking for any_errors_fatal 29922 1726853673.59486: checking for max_fail_percentage 29922 1726853673.59488: done checking for max_fail_percentage 29922 1726853673.59489: checking to see if all hosts have failed and the running result is not ok 29922 1726853673.59490: done checking to see if all hosts have failed 29922 1726853673.59490: getting the remaining hosts for this loop 29922 1726853673.59492: done getting the remaining hosts for this loop 29922 1726853673.59496: getting the next task for host managed_node3 29922 1726853673.59503: done getting next task for host managed_node3 29922 1726853673.59505: ^ task is: TASK: meta (flush_handlers) 29922 1726853673.59507: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853673.59517: getting variables 29922 1726853673.59519: in VariableManager get_vars() 29922 1726853673.59563: Calling all_inventory to load vars for managed_node3 29922 1726853673.59566: Calling groups_inventory to load vars for managed_node3 29922 1726853673.59568: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853673.59795: done sending task result for task 02083763-bbaf-51d4-513b-000000000068 29922 1726853673.59799: WORKER PROCESS EXITING 29922 1726853673.59810: Calling all_plugins_play to load vars for managed_node3 29922 1726853673.59814: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853673.59818: Calling groups_plugins_play to load vars for managed_node3 29922 1726853673.61414: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853673.62935: done with get_vars() 29922 1726853673.62960: done getting variables 29922 1726853673.63033: in VariableManager get_vars() 29922 1726853673.63046: Calling all_inventory to load vars for managed_node3 29922 1726853673.63048: Calling groups_inventory to load vars for managed_node3 29922 1726853673.63050: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853673.63055: Calling all_plugins_play to load vars for managed_node3 29922 1726853673.63057: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853673.63060: Calling groups_plugins_play to load vars for managed_node3 29922 1726853673.64252: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853673.65773: done with get_vars() 29922 1726853673.65802: done queuing things up, now waiting for results queue to drain 29922 1726853673.65804: results queue empty 29922 1726853673.65805: checking for any_errors_fatal 29922 1726853673.65808: done checking for any_errors_fatal 29922 1726853673.65809: checking for max_fail_percentage 29922 1726853673.65810: done checking for max_fail_percentage 29922 1726853673.65811: checking to see if all hosts have failed and the running result is not ok 29922 1726853673.65812: done checking to see if all hosts have failed 29922 1726853673.65812: getting the remaining hosts for this loop 29922 1726853673.65813: done getting the remaining hosts for this loop 29922 1726853673.65816: getting the next task for host managed_node3 29922 1726853673.65820: done getting next task for host managed_node3 29922 1726853673.65822: ^ task is: TASK: meta (flush_handlers) 29922 1726853673.65823: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853673.65826: getting variables 29922 1726853673.65827: in VariableManager get_vars() 29922 1726853673.65840: Calling all_inventory to load vars for managed_node3 29922 1726853673.65843: Calling groups_inventory to load vars for managed_node3 29922 1726853673.65845: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853673.65849: Calling all_plugins_play to load vars for managed_node3 29922 1726853673.65852: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853673.65854: Calling groups_plugins_play to load vars for managed_node3 29922 1726853673.66965: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853673.68511: done with get_vars() 29922 1726853673.68530: done getting variables 29922 1726853673.68582: in VariableManager get_vars() 29922 1726853673.68598: Calling all_inventory to load vars for managed_node3 29922 1726853673.68601: Calling groups_inventory to load vars for managed_node3 29922 1726853673.68603: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853673.68608: Calling all_plugins_play to load vars for managed_node3 29922 1726853673.68610: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853673.68612: Calling groups_plugins_play to load vars for managed_node3 29922 1726853673.69741: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853673.71354: done with get_vars() 29922 1726853673.71383: done queuing things up, now waiting for results queue to drain 29922 1726853673.71386: results queue empty 29922 1726853673.71386: checking for any_errors_fatal 29922 1726853673.71388: done checking for any_errors_fatal 29922 1726853673.71388: checking for max_fail_percentage 29922 1726853673.71390: done checking for max_fail_percentage 29922 1726853673.71391: checking to see if all hosts have failed and the running result is not ok 29922 1726853673.71391: done checking to see if all hosts have failed 29922 1726853673.71392: getting the remaining hosts for this loop 29922 1726853673.71393: done getting the remaining hosts for this loop 29922 1726853673.71396: getting the next task for host managed_node3 29922 1726853673.71399: done getting next task for host managed_node3 29922 1726853673.71400: ^ task is: None 29922 1726853673.71401: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853673.71402: done queuing things up, now waiting for results queue to drain 29922 1726853673.71403: results queue empty 29922 1726853673.71404: checking for any_errors_fatal 29922 1726853673.71404: done checking for any_errors_fatal 29922 1726853673.71405: checking for max_fail_percentage 29922 1726853673.71406: done checking for max_fail_percentage 29922 1726853673.71407: checking to see if all hosts have failed and the running result is not ok 29922 1726853673.71407: done checking to see if all hosts have failed 29922 1726853673.71409: getting the next task for host managed_node3 29922 1726853673.71412: done getting next task for host managed_node3 29922 1726853673.71412: ^ task is: None 29922 1726853673.71413: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29922 1726853673.71486: in VariableManager get_vars() 29922 1726853673.71507: done with get_vars() 29922 1726853673.71513: in VariableManager get_vars() 29922 1726853673.71526: done with get_vars() 29922 1726853673.71537: variable 'omit' from source: magic vars 29922 1726853673.71657: variable 'profile' from source: play vars 29922 1726853673.71780: in VariableManager get_vars() 29922 1726853673.71795: done with get_vars() 29922 1726853673.71817: variable 'omit' from source: magic vars 29922 1726853673.71889: variable 'profile' from source: play vars PLAY [Set down {{ profile }}] ************************************************** 29922 1726853673.72643: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 29922 1726853673.72667: getting the remaining hosts for this loop 29922 1726853673.72668: done getting the remaining hosts for this loop 29922 1726853673.72673: getting the next task for host managed_node3 29922 1726853673.72676: done getting next task for host managed_node3 29922 1726853673.72678: ^ task is: TASK: Gathering Facts 29922 1726853673.72680: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853673.72682: getting variables 29922 1726853673.72683: in VariableManager get_vars() 29922 1726853673.72694: Calling all_inventory to load vars for managed_node3 29922 1726853673.72696: Calling groups_inventory to load vars for managed_node3 29922 1726853673.72698: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853673.72704: Calling all_plugins_play to load vars for managed_node3 29922 1726853673.72706: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853673.72709: Calling groups_plugins_play to load vars for managed_node3 29922 1726853673.74057: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853673.75678: done with get_vars() 29922 1726853673.75699: done getting variables 29922 1726853673.75743: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml:3 Friday 20 September 2024 13:34:33 -0400 (0:00:00.678) 0:00:22.687 ****** 29922 1726853673.75768: entering _queue_task() for managed_node3/gather_facts 29922 1726853673.76141: worker is 1 (out of 1 available) 29922 1726853673.76151: exiting _queue_task() for managed_node3/gather_facts 29922 1726853673.76162: done queuing things up, now waiting for results queue to drain 29922 1726853673.76164: waiting for pending results... 
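The run now moves on to a new play, banner "PLAY [Set down {{ profile }}]" from down_profile.yml, whose first entry is the implicit Gathering Facts task. A minimal sketch of a play header that would produce this banner follows; only the templated name and the implicit fact gathering are taken from the log, while the hosts pattern and the placeholder task are assumptions, not the contents of down_profile.yml.

# Illustrative only: hosts pattern and task body are placeholders, not the
# real down_profile.yml. The templated play name and gather_facts behaviour
# match the log entries around this point.
- name: "Set down {{ profile }}"
  hosts: all                  # placeholder target
  gather_facts: true          # produces the "Gathering Facts" setup run logged below
  tasks:
    - name: Placeholder for the real work of the play
      ansible.builtin.debug:
        msg: "Profile {{ profile }} would be set down here"

The banner shows the literal {{ profile }} because the name is not yet templated at the point the banner is printed; the variable is resolved later, as the "variable 'profile' from source: play vars" entries show. The Gathering Facts task that follows is the setup module packaged as AnsiballZ_setup.py and pushed over the same multiplexed SSH connection (the auto-mux / mux_client_request_session lines).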
29922 1726853673.76551: running TaskExecutor() for managed_node3/TASK: Gathering Facts 29922 1726853673.76555: in run() - task 02083763-bbaf-51d4-513b-0000000004b1 29922 1726853673.76578: variable 'ansible_search_path' from source: unknown 29922 1726853673.76620: calling self._execute() 29922 1726853673.76733: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853673.76746: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853673.76761: variable 'omit' from source: magic vars 29922 1726853673.77166: variable 'ansible_distribution_major_version' from source: facts 29922 1726853673.77186: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853673.77196: variable 'omit' from source: magic vars 29922 1726853673.77235: variable 'omit' from source: magic vars 29922 1726853673.77325: variable 'omit' from source: magic vars 29922 1726853673.77329: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29922 1726853673.77369: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29922 1726853673.77397: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29922 1726853673.77420: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853673.77444: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853673.77482: variable 'inventory_hostname' from source: host vars for 'managed_node3' 29922 1726853673.77543: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853673.77546: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853673.77610: Set connection var ansible_connection to ssh 29922 1726853673.77625: Set connection var ansible_module_compression to ZIP_DEFLATED 29922 1726853673.77637: Set connection var ansible_shell_executable to /bin/sh 29922 1726853673.77657: Set connection var ansible_pipelining to False 29922 1726853673.77668: Set connection var ansible_timeout to 10 29922 1726853673.77678: Set connection var ansible_shell_type to sh 29922 1726853673.77706: variable 'ansible_shell_executable' from source: unknown 29922 1726853673.77713: variable 'ansible_connection' from source: unknown 29922 1726853673.77720: variable 'ansible_module_compression' from source: unknown 29922 1726853673.77761: variable 'ansible_shell_type' from source: unknown 29922 1726853673.77764: variable 'ansible_shell_executable' from source: unknown 29922 1726853673.77766: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853673.77768: variable 'ansible_pipelining' from source: unknown 29922 1726853673.77772: variable 'ansible_timeout' from source: unknown 29922 1726853673.77774: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853673.77949: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 29922 1726853673.77967: variable 'omit' from source: magic vars 29922 1726853673.78077: starting attempt loop 29922 1726853673.78082: running the 
handler 29922 1726853673.78084: variable 'ansible_facts' from source: unknown 29922 1726853673.78087: _low_level_execute_command(): starting 29922 1726853673.78089: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 29922 1726853673.78875: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853673.78880: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853673.78929: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853673.78945: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853673.78982: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853673.79082: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853673.80813: stdout chunk (state=3): >>>/root <<< 29922 1726853673.80978: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853673.80981: stdout chunk (state=3): >>><<< 29922 1726853673.80984: stderr chunk (state=3): >>><<< 29922 1726853673.81106: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853673.81109: _low_level_execute_command(): starting 29922 1726853673.81112: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853673.8101087-31022-96341146187552 `" && echo 
ansible-tmp-1726853673.8101087-31022-96341146187552="` echo /root/.ansible/tmp/ansible-tmp-1726853673.8101087-31022-96341146187552 `" ) && sleep 0' 29922 1726853673.81663: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853673.81680: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853673.81695: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853673.81725: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853673.81741: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 29922 1726853673.81753: stderr chunk (state=3): >>>debug2: match not found <<< 29922 1726853673.81782: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853673.81829: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853673.81894: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853673.81953: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853673.82014: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853673.83984: stdout chunk (state=3): >>>ansible-tmp-1726853673.8101087-31022-96341146187552=/root/.ansible/tmp/ansible-tmp-1726853673.8101087-31022-96341146187552 <<< 29922 1726853673.84152: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853673.84159: stdout chunk (state=3): >>><<< 29922 1726853673.84162: stderr chunk (state=3): >>><<< 29922 1726853673.84258: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853673.8101087-31022-96341146187552=/root/.ansible/tmp/ansible-tmp-1726853673.8101087-31022-96341146187552 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853673.84262: variable 'ansible_module_compression' from source: unknown 29922 1726853673.84288: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-29922avbm1zsd/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 29922 1726853673.84361: variable 'ansible_facts' from source: unknown 29922 1726853673.84553: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853673.8101087-31022-96341146187552/AnsiballZ_setup.py 29922 1726853673.84728: Sending initial data 29922 1726853673.84737: Sent initial data (153 bytes) 29922 1726853673.85393: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853673.85498: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853673.85511: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853673.85617: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853673.87234: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 29922 1726853673.87249: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 29922 1726853673.87296: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 29922 1726853673.87363: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-29922avbm1zsd/tmp7ivvldb1 /root/.ansible/tmp/ansible-tmp-1726853673.8101087-31022-96341146187552/AnsiballZ_setup.py <<< 29922 1726853673.87366: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853673.8101087-31022-96341146187552/AnsiballZ_setup.py" <<< 29922 1726853673.87413: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-29922avbm1zsd/tmp7ivvldb1" to remote "/root/.ansible/tmp/ansible-tmp-1726853673.8101087-31022-96341146187552/AnsiballZ_setup.py" <<< 29922 1726853673.87419: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853673.8101087-31022-96341146187552/AnsiballZ_setup.py" <<< 29922 1726853673.88780: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853673.88783: stderr chunk (state=3): >>><<< 29922 1726853673.88786: stdout chunk (state=3): >>><<< 29922 1726853673.88788: done transferring module to remote 29922 1726853673.88790: _low_level_execute_command(): starting 29922 1726853673.88792: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853673.8101087-31022-96341146187552/ /root/.ansible/tmp/ansible-tmp-1726853673.8101087-31022-96341146187552/AnsiballZ_setup.py && sleep 0' 29922 1726853673.89372: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853673.89398: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853673.89419: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853673.89422: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853673.89424: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853673.89426: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853673.89498: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853673.89529: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853673.89582: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853673.91420: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853673.91442: stderr chunk (state=3): >>><<< 29922 1726853673.91445: stdout chunk (state=3): >>><<< 29922 1726853673.91460: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853673.91463: _low_level_execute_command(): starting 29922 1726853673.91468: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853673.8101087-31022-96341146187552/AnsiballZ_setup.py && sleep 0' 29922 1726853673.91890: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853673.91893: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 29922 1726853673.91897: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 29922 1726853673.91899: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853673.91941: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853673.91945: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853673.92015: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853674.59650: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_loadavg": {"1m": 0.55029296875, "5m": 0.501953125, "15m": 0.3056640625}, 
"ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-11-217.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-217", "ansible_nodename": "ip-10-31-11-217.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2a85cf21f17783c9da20681cb8e352", "ansible_is_chroot": false, "ansible_fibre_channel_wwn": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_fips": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCoYDjPBT+oZMoH+Iz89VMad3pzgkIKqOuUO8QZPyEpVfgGNjVOaglgkXLQyOXulclB6EA4nlBVmVP6IyWL+N4gskjf5Qmm5n+WHu3amXXm9l66v+yKWruV18sgIn8o6iAdCgZrJFzdFVNeKRLYV6P67syyegqRkOJe7U2m/rxA967Vm6wcrwJN8eiZc8tbx1lHOuJNCcP20ThNxMHIPfvT2im8rlt/ojf6e+Z1axRnvFBubBg1qDfxHEX6AlxMHl+CIOXxGHxsvSxfLq7lhrXrComBSxTDv+fmHeJkck3VGck2rn8Hz3eBTty453RU3pq6KxdqU1GB+Z+VYHInXEtXb2i38QpLqRfiLqwUxdjkFKqz0j5z/+NskQF3w8j6uz77Revs9G8eMs14sICJtnD9qBLhFuGxlR/ovZCjynVjBTKBfDaVASpjV0iCIZKSgLn6zSaM/6FBhnBoeb4ch2iiw2S8Q0aKJNKIp0AfY21IVplyS3VCu+Br3fIXzoINo0s=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIX3PaAHBZzPq23oQJkywX/2bB39yz9ZrSLgsNfhL04NHwnY0Up/oiN+aiUte1DWFqV5wiDLJpl9a1tDLARWXNA=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIBb2CS4KVp6KVVqnGA45j7tkSkijXfGxbd3neKpDsjh9", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2979, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 552, "free": 2979}, "nocache": {"free": 3297, "used": 234}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2a85cf-21f1-7783-c9da-20681cb8e352", "ansible_product_uuid": "ec2a85cf-21f1-7783-c9da-20681cb8e352", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, 
"sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 818, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261798653952, "block_size": 4096, "block_total": 65519099, "block_available": 63915687, "block_used": 1603412, "inode_total": 131070960, "inode_available": 131029145, "inode_used": 41815, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "34", "second": "34", "epoch": "1726853674", "epoch_int": "1726853674", "date": "2024-09-20", "time": "13:34:34", "iso8601_micro": "2024-09-20T17:34:34.516705Z", "iso8601": "2024-09-20T17:34:34Z", "iso8601_basic": "20240920T133434516705", "iso8601_basic_short": "20240920T133434", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_apparmor": {"status": "disabled"}, "ansible_lsb": {}, "ansible_interfaces": ["ethtest0", "peerethtest0", "eth0", "lo", "rpltstbr"], "ansible_eth0": {"device": "eth0", "macaddress": "12:2a:53:36:f0:e9", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::102a:53ff:fe36:f0e9", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", 
"tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off"<<< 29922 1726853674.59666: stdout chunk (state=3): >>>, "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off 
[fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_rpltstbr": {"device": "rpltstbr", "macaddress": "6e:7c:f1:8e:1c:81", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "ipv4": {"address": "192.0.2.72", "broadcast": "", "netmask": "255.255.255.254", "network": "192.0.2.72", "prefix": "31"}, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_ethtest0": {"device": "ethtest0", "macaddress": "ae:20:01:6f:4b:76", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv4": {"address": "198.51.100.3", "broadcast": "198.51.100.63", "netmask": "255.255.255.192", "network": "198.51.100.0", "prefix": "26"}, "ipv6": [{"address": "2001:db8::2", "prefix": "32", "scope": "global"}, {"address": "fe80::ac20:1ff:fe6f:4b76", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", 
"tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_peerethtest0": {"device": "peerethtest0", "macaddress": "62:d7:bc:c2:71:2d", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::60d7:bcff:fec2:712d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", 
"rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:2a:53:36:f0:e9", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.217", "192.0.2.72", "198.51.100.3"], "ansible_all_ipv6_addresses": ["fe80::102a:53ff:fe36:f0e9", "2001:db8::2", "fe80::ac20:1ff:fe6f:4b76", "fe80::60d7:bcff:fec2:712d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.217", "127.0.0.0/8", "127.0.0.1", "192.0.2.72", "198.51.100.3"], "ipv6": ["::1", "2001:db8::2", "fe80::102a:53ff:fe36:f0e9", "fe80::60d7:bcff:fec2:712d", "fe80::ac20:1ff:fe6f:4b76"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 55604 10.31.11.217 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 55604 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_iscsi_iqn": "", "ansible_local": {}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 29922 1726853674.61479: stderr chunk (state=3): >>>debug2: Received exit status 
from master 0 Shared connection to 10.31.11.217 closed. <<< 29922 1726853674.61483: stdout chunk (state=3): >>><<< 29922 1726853674.61486: stderr chunk (state=3): >>><<< 29922 1726853674.61579: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_loadavg": {"1m": 0.55029296875, "5m": 0.501953125, "15m": 0.3056640625}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-11-217.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-217", "ansible_nodename": "ip-10-31-11-217.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2a85cf21f17783c9da20681cb8e352", "ansible_is_chroot": false, "ansible_fibre_channel_wwn": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_fips": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCoYDjPBT+oZMoH+Iz89VMad3pzgkIKqOuUO8QZPyEpVfgGNjVOaglgkXLQyOXulclB6EA4nlBVmVP6IyWL+N4gskjf5Qmm5n+WHu3amXXm9l66v+yKWruV18sgIn8o6iAdCgZrJFzdFVNeKRLYV6P67syyegqRkOJe7U2m/rxA967Vm6wcrwJN8eiZc8tbx1lHOuJNCcP20ThNxMHIPfvT2im8rlt/ojf6e+Z1axRnvFBubBg1qDfxHEX6AlxMHl+CIOXxGHxsvSxfLq7lhrXrComBSxTDv+fmHeJkck3VGck2rn8Hz3eBTty453RU3pq6KxdqU1GB+Z+VYHInXEtXb2i38QpLqRfiLqwUxdjkFKqz0j5z/+NskQF3w8j6uz77Revs9G8eMs14sICJtnD9qBLhFuGxlR/ovZCjynVjBTKBfDaVASpjV0iCIZKSgLn6zSaM/6FBhnBoeb4ch2iiw2S8Q0aKJNKIp0AfY21IVplyS3VCu+Br3fIXzoINo0s=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIX3PaAHBZzPq23oQJkywX/2bB39yz9ZrSLgsNfhL04NHwnY0Up/oiN+aiUte1DWFqV5wiDLJpl9a1tDLARWXNA=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIBb2CS4KVp6KVVqnGA45j7tkSkijXfGxbd3neKpDsjh9", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2979, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 
0, "ansible_memory_mb": {"real": {"total": 3531, "used": 552, "free": 2979}, "nocache": {"free": 3297, "used": 234}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2a85cf-21f1-7783-c9da-20681cb8e352", "ansible_product_uuid": "ec2a85cf-21f1-7783-c9da-20681cb8e352", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 818, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261798653952, "block_size": 4096, "block_total": 65519099, "block_available": 63915687, "block_used": 1603412, "inode_total": 131070960, "inode_available": 131029145, "inode_used": 41815, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "34", "second": "34", "epoch": "1726853674", "epoch_int": "1726853674", "date": "2024-09-20", "time": "13:34:34", "iso8601_micro": "2024-09-20T17:34:34.516705Z", "iso8601": "2024-09-20T17:34:34Z", "iso8601_basic": "20240920T133434516705", "iso8601_basic_short": "20240920T133434", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_apparmor": {"status": "disabled"}, "ansible_lsb": {}, "ansible_interfaces": ["ethtest0", "peerethtest0", "eth0", "lo", "rpltstbr"], "ansible_eth0": {"device": "eth0", "macaddress": "12:2a:53:36:f0:e9", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::102a:53ff:fe36:f0e9", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", 
"tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": 
"off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_rpltstbr": {"device": "rpltstbr", "macaddress": "6e:7c:f1:8e:1c:81", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "ipv4": {"address": "192.0.2.72", "broadcast": "", "netmask": "255.255.255.254", "network": "192.0.2.72", "prefix": "31"}, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", 
"hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_ethtest0": {"device": "ethtest0", "macaddress": "ae:20:01:6f:4b:76", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv4": {"address": "198.51.100.3", "broadcast": "198.51.100.63", "netmask": "255.255.255.192", "network": "198.51.100.0", "prefix": "26"}, "ipv6": [{"address": "2001:db8::2", "prefix": "32", "scope": "global"}, {"address": "fe80::ac20:1ff:fe6f:4b76", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_peerethtest0": {"device": "peerethtest0", "macaddress": "62:d7:bc:c2:71:2d", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::60d7:bcff:fec2:712d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", 
"tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:2a:53:36:f0:e9", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.217", "192.0.2.72", "198.51.100.3"], "ansible_all_ipv6_addresses": ["fe80::102a:53ff:fe36:f0e9", "2001:db8::2", "fe80::ac20:1ff:fe6f:4b76", "fe80::60d7:bcff:fec2:712d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.217", "127.0.0.0/8", "127.0.0.1", "192.0.2.72", "198.51.100.3"], "ipv6": ["::1", "2001:db8::2", "fe80::102a:53ff:fe36:f0e9", "fe80::60d7:bcff:fec2:712d", "fe80::ac20:1ff:fe6f:4b76"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 55604 10.31.11.217 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 55604 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": 
"0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_iscsi_iqn": "", "ansible_local": {}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
29922 1726853674.62712: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853673.8101087-31022-96341146187552/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 29922 1726853674.62731: _low_level_execute_command(): starting 29922 1726853674.62793: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853673.8101087-31022-96341146187552/ > /dev/null 2>&1 && sleep 0' 29922 1726853674.63946: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853674.63994: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853674.64150: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853674.64357: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853674.64412: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853674.66381: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853674.66385: stdout chunk (state=3): >>><<< 29922 1726853674.66387: stderr chunk (state=3): >>><<< 29922 1726853674.66424: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853674.66678: handler run complete 29922 1726853674.66894: variable 'ansible_facts' from source: unknown 29922 1726853674.67044: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853674.68168: variable 'ansible_facts' from source: unknown 29922 1726853674.68175: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853674.68565: attempt loop complete, returning result 29922 1726853674.68609: _execute() done 29922 1726853674.68617: dumping result to json 29922 1726853674.68665: done dumping result, returning 29922 1726853674.68720: done running TaskExecutor() for managed_node3/TASK: Gathering Facts [02083763-bbaf-51d4-513b-0000000004b1] 29922 1726853674.69078: sending task result for task 02083763-bbaf-51d4-513b-0000000004b1 29922 1726853674.69905: done sending task result for task 02083763-bbaf-51d4-513b-0000000004b1 29922 1726853674.69908: WORKER PROCESS EXITING ok: [managed_node3] 29922 1726853674.70269: no more pending results, returning what we have 29922 1726853674.70676: results queue empty 29922 1726853674.70677: checking for any_errors_fatal 29922 1726853674.70679: done checking for any_errors_fatal 29922 1726853674.70680: checking for max_fail_percentage 29922 1726853674.70681: done checking for max_fail_percentage 29922 1726853674.70682: checking to see if all hosts have failed and the running result is not ok 29922 1726853674.70682: done checking to see if all hosts have failed 29922 1726853674.70683: getting the remaining hosts for this loop 29922 1726853674.70684: done getting the remaining hosts for this loop 29922 1726853674.70688: getting the next task for host managed_node3 29922 1726853674.70693: done getting next task for host managed_node3 29922 1726853674.70695: ^ task is: TASK: meta (flush_handlers) 29922 1726853674.70697: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853674.70701: getting variables 29922 1726853674.70702: in VariableManager get_vars() 29922 1726853674.70731: Calling all_inventory to load vars for managed_node3 29922 1726853674.70734: Calling groups_inventory to load vars for managed_node3 29922 1726853674.70736: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853674.70746: Calling all_plugins_play to load vars for managed_node3 29922 1726853674.70749: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853674.70752: Calling groups_plugins_play to load vars for managed_node3 29922 1726853674.73948: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853674.75866: done with get_vars() 29922 1726853674.75902: done getting variables 29922 1726853674.75975: in VariableManager get_vars() 29922 1726853674.75993: Calling all_inventory to load vars for managed_node3 29922 1726853674.75995: Calling groups_inventory to load vars for managed_node3 29922 1726853674.75997: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853674.76002: Calling all_plugins_play to load vars for managed_node3 29922 1726853674.76005: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853674.76007: Calling groups_plugins_play to load vars for managed_node3 29922 1726853674.77217: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853674.79672: done with get_vars() 29922 1726853674.79716: done queuing things up, now waiting for results queue to drain 29922 1726853674.79719: results queue empty 29922 1726853674.79720: checking for any_errors_fatal 29922 1726853674.79725: done checking for any_errors_fatal 29922 1726853674.79726: checking for max_fail_percentage 29922 1726853674.79727: done checking for max_fail_percentage 29922 1726853674.79728: checking to see if all hosts have failed and the running result is not ok 29922 1726853674.79728: done checking to see if all hosts have failed 29922 1726853674.79733: getting the remaining hosts for this loop 29922 1726853674.79735: done getting the remaining hosts for this loop 29922 1726853674.79738: getting the next task for host managed_node3 29922 1726853674.79742: done getting next task for host managed_node3 29922 1726853674.79745: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 29922 1726853674.79747: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853674.79758: getting variables 29922 1726853674.79759: in VariableManager get_vars() 29922 1726853674.79777: Calling all_inventory to load vars for managed_node3 29922 1726853674.79780: Calling groups_inventory to load vars for managed_node3 29922 1726853674.79782: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853674.79788: Calling all_plugins_play to load vars for managed_node3 29922 1726853674.79790: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853674.79793: Calling groups_plugins_play to load vars for managed_node3 29922 1726853674.81059: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853674.84121: done with get_vars() 29922 1726853674.84154: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 13:34:34 -0400 (0:00:01.084) 0:00:23.771 ****** 29922 1726853674.84237: entering _queue_task() for managed_node3/include_tasks 29922 1726853674.84709: worker is 1 (out of 1 available) 29922 1726853674.84720: exiting _queue_task() for managed_node3/include_tasks 29922 1726853674.84732: done queuing things up, now waiting for results queue to drain 29922 1726853674.84733: waiting for pending results... 29922 1726853674.84916: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 29922 1726853674.85068: in run() - task 02083763-bbaf-51d4-513b-000000000071 29922 1726853674.85074: variable 'ansible_search_path' from source: unknown 29922 1726853674.85078: variable 'ansible_search_path' from source: unknown 29922 1726853674.85104: calling self._execute() 29922 1726853674.85252: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853674.85255: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853674.85258: variable 'omit' from source: magic vars 29922 1726853674.85643: variable 'ansible_distribution_major_version' from source: facts 29922 1726853674.85659: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853674.85673: _execute() done 29922 1726853674.85685: dumping result to json 29922 1726853674.85692: done dumping result, returning 29922 1726853674.85718: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [02083763-bbaf-51d4-513b-000000000071] 29922 1726853674.85721: sending task result for task 02083763-bbaf-51d4-513b-000000000071 29922 1726853674.85921: done sending task result for task 02083763-bbaf-51d4-513b-000000000071 29922 1726853674.85924: WORKER PROCESS EXITING 29922 1726853674.85973: no more pending results, returning what we have 29922 1726853674.85978: in VariableManager get_vars() 29922 1726853674.86186: Calling all_inventory to load vars for managed_node3 29922 1726853674.86189: Calling groups_inventory to load vars for managed_node3 29922 1726853674.86191: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853674.86200: Calling all_plugins_play to load vars for managed_node3 29922 1726853674.86202: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853674.86205: Calling groups_plugins_play to load vars for managed_node3 29922 1726853674.87611: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853674.89983: done with get_vars() 29922 1726853674.90011: variable 'ansible_search_path' from source: unknown 29922 1726853674.90013: variable 'ansible_search_path' from source: unknown 29922 1726853674.90044: we have included files to process 29922 1726853674.90045: generating all_blocks data 29922 1726853674.90047: done generating all_blocks data 29922 1726853674.90047: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 29922 1726853674.90049: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 29922 1726853674.90051: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 29922 1726853674.90619: done processing included file 29922 1726853674.90621: iterating over new_blocks loaded from include file 29922 1726853674.90623: in VariableManager get_vars() 29922 1726853674.90641: done with get_vars() 29922 1726853674.90643: filtering new block on tags 29922 1726853674.90657: done filtering new block on tags 29922 1726853674.90659: in VariableManager get_vars() 29922 1726853674.90677: done with get_vars() 29922 1726853674.90679: filtering new block on tags 29922 1726853674.90694: done filtering new block on tags 29922 1726853674.90696: in VariableManager get_vars() 29922 1726853674.90721: done with get_vars() 29922 1726853674.90723: filtering new block on tags 29922 1726853674.90741: done filtering new block on tags 29922 1726853674.90743: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3 29922 1726853674.90749: extending task lists for all hosts with included blocks 29922 1726853674.91141: done extending task lists 29922 1726853674.91142: done processing included files 29922 1726853674.91143: results queue empty 29922 1726853674.91144: checking for any_errors_fatal 29922 1726853674.91145: done checking for any_errors_fatal 29922 1726853674.91146: checking for max_fail_percentage 29922 1726853674.91147: done checking for max_fail_percentage 29922 1726853674.91148: checking to see if all hosts have failed and the running result is not ok 29922 1726853674.91149: done checking to see if all hosts have failed 29922 1726853674.91149: getting the remaining hosts for this loop 29922 1726853674.91155: done getting the remaining hosts for this loop 29922 1726853674.91158: getting the next task for host managed_node3 29922 1726853674.91162: done getting next task for host managed_node3 29922 1726853674.91164: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 29922 1726853674.91167: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853674.91178: getting variables 29922 1726853674.91179: in VariableManager get_vars() 29922 1726853674.91193: Calling all_inventory to load vars for managed_node3 29922 1726853674.91195: Calling groups_inventory to load vars for managed_node3 29922 1726853674.91197: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853674.91203: Calling all_plugins_play to load vars for managed_node3 29922 1726853674.91205: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853674.91208: Calling groups_plugins_play to load vars for managed_node3 29922 1726853674.97698: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853674.99313: done with get_vars() 29922 1726853674.99339: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 13:34:34 -0400 (0:00:00.151) 0:00:23.923 ****** 29922 1726853674.99421: entering _queue_task() for managed_node3/setup 29922 1726853674.99781: worker is 1 (out of 1 available) 29922 1726853674.99795: exiting _queue_task() for managed_node3/setup 29922 1726853674.99811: done queuing things up, now waiting for results queue to drain 29922 1726853674.99812: waiting for pending results... 29922 1726853675.00029: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 29922 1726853675.00156: in run() - task 02083763-bbaf-51d4-513b-0000000004f2 29922 1726853675.00175: variable 'ansible_search_path' from source: unknown 29922 1726853675.00178: variable 'ansible_search_path' from source: unknown 29922 1726853675.00229: calling self._execute() 29922 1726853675.00302: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853675.00336: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853675.00344: variable 'omit' from source: magic vars 29922 1726853675.00681: variable 'ansible_distribution_major_version' from source: facts 29922 1726853675.00693: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853675.00901: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 29922 1726853675.03175: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 29922 1726853675.03230: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 29922 1726853675.03281: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 29922 1726853675.03476: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 29922 1726853675.03480: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 29922 1726853675.03483: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853675.03486: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 29922 1726853675.03498: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853675.03544: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853675.03566: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853675.03636: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853675.03666: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29922 1726853675.03697: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853675.03744: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853675.03760: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853675.03914: variable '__network_required_facts' from source: role '' defaults 29922 1726853675.03936: variable 'ansible_facts' from source: unknown 29922 1726853675.04822: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 29922 1726853675.04876: when evaluation is False, skipping this task 29922 1726853675.04879: _execute() done 29922 1726853675.04881: dumping result to json 29922 1726853675.04883: done dumping result, returning 29922 1726853675.04886: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [02083763-bbaf-51d4-513b-0000000004f2] 29922 1726853675.04888: sending task result for task 02083763-bbaf-51d4-513b-0000000004f2 29922 1726853675.05177: done sending task result for task 02083763-bbaf-51d4-513b-0000000004f2 29922 1726853675.05181: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 29922 1726853675.05223: no more pending results, returning what we have 29922 1726853675.05226: results queue empty 29922 1726853675.05227: checking for any_errors_fatal 29922 1726853675.05229: done checking for any_errors_fatal 29922 1726853675.05229: checking for max_fail_percentage 29922 1726853675.05231: done checking for max_fail_percentage 29922 1726853675.05232: checking to see if all hosts have failed and the running result is not ok 29922 1726853675.05233: done checking to see if all hosts have failed 29922 1726853675.05233: getting the remaining hosts for this loop 29922 1726853675.05235: done getting the remaining hosts for 
this loop 29922 1726853675.05238: getting the next task for host managed_node3 29922 1726853675.05246: done getting next task for host managed_node3 29922 1726853675.05251: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 29922 1726853675.05254: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29922 1726853675.05268: getting variables 29922 1726853675.05269: in VariableManager get_vars() 29922 1726853675.05310: Calling all_inventory to load vars for managed_node3 29922 1726853675.05313: Calling groups_inventory to load vars for managed_node3 29922 1726853675.05315: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853675.05326: Calling all_plugins_play to load vars for managed_node3 29922 1726853675.05329: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853675.05332: Calling groups_plugins_play to load vars for managed_node3 29922 1726853675.06842: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853675.08648: done with get_vars() 29922 1726853675.08669: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 13:34:35 -0400 (0:00:00.093) 0:00:24.017 ****** 29922 1726853675.08769: entering _queue_task() for managed_node3/stat 29922 1726853675.09107: worker is 1 (out of 1 available) 29922 1726853675.09119: exiting _queue_task() for managed_node3/stat 29922 1726853675.09130: done queuing things up, now waiting for results queue to drain 29922 1726853675.09131: waiting for pending results... 
29922 1726853675.09436: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 29922 1726853675.09598: in run() - task 02083763-bbaf-51d4-513b-0000000004f4 29922 1726853675.09619: variable 'ansible_search_path' from source: unknown 29922 1726853675.09706: variable 'ansible_search_path' from source: unknown 29922 1726853675.09710: calling self._execute() 29922 1726853675.09778: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853675.09792: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853675.09815: variable 'omit' from source: magic vars 29922 1726853675.10215: variable 'ansible_distribution_major_version' from source: facts 29922 1726853675.10233: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853675.10409: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 29922 1726853675.10692: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 29922 1726853675.10741: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 29922 1726853675.10826: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 29922 1726853675.10864: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 29922 1726853675.10959: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 29922 1726853675.11005: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 29922 1726853675.11028: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853675.11115: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 29922 1726853675.11158: variable '__network_is_ostree' from source: set_fact 29922 1726853675.11172: Evaluated conditional (not __network_is_ostree is defined): False 29922 1726853675.11181: when evaluation is False, skipping this task 29922 1726853675.11188: _execute() done 29922 1726853675.11195: dumping result to json 29922 1726853675.11202: done dumping result, returning 29922 1726853675.11218: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [02083763-bbaf-51d4-513b-0000000004f4] 29922 1726853675.11232: sending task result for task 02083763-bbaf-51d4-513b-0000000004f4 29922 1726853675.11486: done sending task result for task 02083763-bbaf-51d4-513b-0000000004f4 29922 1726853675.11490: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 29922 1726853675.11541: no more pending results, returning what we have 29922 1726853675.11546: results queue empty 29922 1726853675.11546: checking for any_errors_fatal 29922 1726853675.11552: done checking for any_errors_fatal 29922 1726853675.11553: checking for 
max_fail_percentage 29922 1726853675.11554: done checking for max_fail_percentage 29922 1726853675.11555: checking to see if all hosts have failed and the running result is not ok 29922 1726853675.11556: done checking to see if all hosts have failed 29922 1726853675.11557: getting the remaining hosts for this loop 29922 1726853675.11558: done getting the remaining hosts for this loop 29922 1726853675.11562: getting the next task for host managed_node3 29922 1726853675.11569: done getting next task for host managed_node3 29922 1726853675.11575: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 29922 1726853675.11578: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29922 1726853675.11598: getting variables 29922 1726853675.11600: in VariableManager get_vars() 29922 1726853675.11639: Calling all_inventory to load vars for managed_node3 29922 1726853675.11641: Calling groups_inventory to load vars for managed_node3 29922 1726853675.11644: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853675.11655: Calling all_plugins_play to load vars for managed_node3 29922 1726853675.11659: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853675.11662: Calling groups_plugins_play to load vars for managed_node3 29922 1726853675.13154: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853675.14761: done with get_vars() 29922 1726853675.14785: done getting variables 29922 1726853675.14846: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 13:34:35 -0400 (0:00:00.061) 0:00:24.078 ****** 29922 1726853675.14884: entering _queue_task() for managed_node3/set_fact 29922 1726853675.15233: worker is 1 (out of 1 available) 29922 1726853675.15246: exiting _queue_task() for managed_node3/set_fact 29922 1726853675.15257: done queuing things up, now waiting for results queue to drain 29922 1726853675.15258: waiting for pending results... 
29922 1726853675.15539: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 29922 1726853675.15681: in run() - task 02083763-bbaf-51d4-513b-0000000004f5 29922 1726853675.15806: variable 'ansible_search_path' from source: unknown 29922 1726853675.15809: variable 'ansible_search_path' from source: unknown 29922 1726853675.15812: calling self._execute() 29922 1726853675.15851: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853675.15863: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853675.15879: variable 'omit' from source: magic vars 29922 1726853675.16258: variable 'ansible_distribution_major_version' from source: facts 29922 1726853675.16276: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853675.16439: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 29922 1726853675.16729: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 29922 1726853675.16776: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 29922 1726853675.16823: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 29922 1726853675.16905: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 29922 1726853675.16993: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 29922 1726853675.17028: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 29922 1726853675.17060: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853675.17093: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 29922 1726853675.17185: variable '__network_is_ostree' from source: set_fact 29922 1726853675.17221: Evaluated conditional (not __network_is_ostree is defined): False 29922 1726853675.17225: when evaluation is False, skipping this task 29922 1726853675.17227: _execute() done 29922 1726853675.17229: dumping result to json 29922 1726853675.17231: done dumping result, returning 29922 1726853675.17233: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [02083763-bbaf-51d4-513b-0000000004f5] 29922 1726853675.17240: sending task result for task 02083763-bbaf-51d4-513b-0000000004f5 29922 1726853675.17394: done sending task result for task 02083763-bbaf-51d4-513b-0000000004f5 29922 1726853675.17397: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 29922 1726853675.17482: no more pending results, returning what we have 29922 1726853675.17486: results queue empty 29922 1726853675.17487: checking for any_errors_fatal 29922 1726853675.17494: done checking for any_errors_fatal 29922 
1726853675.17495: checking for max_fail_percentage 29922 1726853675.17497: done checking for max_fail_percentage 29922 1726853675.17498: checking to see if all hosts have failed and the running result is not ok 29922 1726853675.17499: done checking to see if all hosts have failed 29922 1726853675.17499: getting the remaining hosts for this loop 29922 1726853675.17501: done getting the remaining hosts for this loop 29922 1726853675.17505: getting the next task for host managed_node3 29922 1726853675.17515: done getting next task for host managed_node3 29922 1726853675.17519: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 29922 1726853675.17522: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29922 1726853675.17537: getting variables 29922 1726853675.17538: in VariableManager get_vars() 29922 1726853675.17689: Calling all_inventory to load vars for managed_node3 29922 1726853675.17692: Calling groups_inventory to load vars for managed_node3 29922 1726853675.17695: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853675.17704: Calling all_plugins_play to load vars for managed_node3 29922 1726853675.17707: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853675.17710: Calling groups_plugins_play to load vars for managed_node3 29922 1726853675.19315: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853675.20919: done with get_vars() 29922 1726853675.20943: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 13:34:35 -0400 (0:00:00.061) 0:00:24.140 ****** 29922 1726853675.21045: entering _queue_task() for managed_node3/service_facts 29922 1726853675.21601: worker is 1 (out of 1 available) 29922 1726853675.21612: exiting _queue_task() for managed_node3/service_facts 29922 1726853675.21623: done queuing things up, now waiting for results queue to drain 29922 1726853675.21624: waiting for pending results... 
29922 1726853675.21755: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running 29922 1726853675.21857: in run() - task 02083763-bbaf-51d4-513b-0000000004f7 29922 1726853675.21882: variable 'ansible_search_path' from source: unknown 29922 1726853675.21889: variable 'ansible_search_path' from source: unknown 29922 1726853675.21961: calling self._execute() 29922 1726853675.22029: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853675.22043: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853675.22059: variable 'omit' from source: magic vars 29922 1726853675.22454: variable 'ansible_distribution_major_version' from source: facts 29922 1726853675.22504: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853675.22507: variable 'omit' from source: magic vars 29922 1726853675.22544: variable 'omit' from source: magic vars 29922 1726853675.22583: variable 'omit' from source: magic vars 29922 1726853675.22629: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29922 1726853675.22667: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29922 1726853675.22720: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29922 1726853675.22724: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853675.22729: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853675.22758: variable 'inventory_hostname' from source: host vars for 'managed_node3' 29922 1726853675.22765: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853675.22773: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853675.22868: Set connection var ansible_connection to ssh 29922 1726853675.22937: Set connection var ansible_module_compression to ZIP_DEFLATED 29922 1726853675.22940: Set connection var ansible_shell_executable to /bin/sh 29922 1726853675.22942: Set connection var ansible_pipelining to False 29922 1726853675.22944: Set connection var ansible_timeout to 10 29922 1726853675.22945: Set connection var ansible_shell_type to sh 29922 1726853675.22947: variable 'ansible_shell_executable' from source: unknown 29922 1726853675.22949: variable 'ansible_connection' from source: unknown 29922 1726853675.22951: variable 'ansible_module_compression' from source: unknown 29922 1726853675.22952: variable 'ansible_shell_type' from source: unknown 29922 1726853675.22954: variable 'ansible_shell_executable' from source: unknown 29922 1726853675.22956: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853675.22958: variable 'ansible_pipelining' from source: unknown 29922 1726853675.22963: variable 'ansible_timeout' from source: unknown 29922 1726853675.22969: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853675.23174: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 29922 1726853675.23192: variable 'omit' from source: magic vars 29922 
1726853675.23202: starting attempt loop 29922 1726853675.23264: running the handler 29922 1726853675.23268: _low_level_execute_command(): starting 29922 1726853675.23273: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 29922 1726853675.23975: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853675.23990: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853675.24006: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853675.24120: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853675.24150: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853675.24260: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853675.25955: stdout chunk (state=3): >>>/root <<< 29922 1726853675.26112: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853675.26116: stdout chunk (state=3): >>><<< 29922 1726853675.26118: stderr chunk (state=3): >>><<< 29922 1726853675.26240: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853675.26244: _low_level_execute_command(): starting 29922 1726853675.26247: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853675.2615073-31075-234430737595304 `" && echo 
ansible-tmp-1726853675.2615073-31075-234430737595304="` echo /root/.ansible/tmp/ansible-tmp-1726853675.2615073-31075-234430737595304 `" ) && sleep 0' 29922 1726853675.26837: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853675.26852: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853675.26869: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853675.26890: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853675.26910: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 29922 1726853675.26931: stderr chunk (state=3): >>>debug2: match not found <<< 29922 1726853675.27048: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853675.27065: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853675.27095: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853675.27199: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853675.29166: stdout chunk (state=3): >>>ansible-tmp-1726853675.2615073-31075-234430737595304=/root/.ansible/tmp/ansible-tmp-1726853675.2615073-31075-234430737595304 <<< 29922 1726853675.29311: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853675.29321: stdout chunk (state=3): >>><<< 29922 1726853675.29335: stderr chunk (state=3): >>><<< 29922 1726853675.29355: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853675.2615073-31075-234430737595304=/root/.ansible/tmp/ansible-tmp-1726853675.2615073-31075-234430737595304 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 
debug2: Received exit status from master 0 29922 1726853675.29408: variable 'ansible_module_compression' from source: unknown 29922 1726853675.29483: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-29922avbm1zsd/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 29922 1726853675.29514: variable 'ansible_facts' from source: unknown 29922 1726853675.29678: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853675.2615073-31075-234430737595304/AnsiballZ_service_facts.py 29922 1726853675.29803: Sending initial data 29922 1726853675.29815: Sent initial data (162 bytes) 29922 1726853675.30386: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853675.30400: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853675.30448: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853675.30522: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853675.30551: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853675.30577: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853675.30673: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853675.32306: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 29922 1726853675.32387: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 29922 1726853675.32480: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-29922avbm1zsd/tmpjoylwgoy /root/.ansible/tmp/ansible-tmp-1726853675.2615073-31075-234430737595304/AnsiballZ_service_facts.py <<< 29922 1726853675.32483: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853675.2615073-31075-234430737595304/AnsiballZ_service_facts.py" <<< 29922 1726853675.32534: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-29922avbm1zsd/tmpjoylwgoy" to remote "/root/.ansible/tmp/ansible-tmp-1726853675.2615073-31075-234430737595304/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853675.2615073-31075-234430737595304/AnsiballZ_service_facts.py" <<< 29922 1726853675.33433: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853675.33436: stdout chunk (state=3): >>><<< 29922 1726853675.33541: stderr chunk (state=3): >>><<< 29922 1726853675.33547: done transferring module to remote 29922 1726853675.33565: _low_level_execute_command(): starting 29922 1726853675.33580: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853675.2615073-31075-234430737595304/ /root/.ansible/tmp/ansible-tmp-1726853675.2615073-31075-234430737595304/AnsiballZ_service_facts.py && sleep 0' 29922 1726853675.34179: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853675.34198: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853675.34213: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853675.34233: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853675.34289: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853675.34349: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853675.34381: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853675.34465: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853675.36476: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853675.36480: stdout chunk (state=3): >>><<< 29922 1726853675.36482: stderr chunk (state=3): >>><<< 29922 1726853675.36485: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853675.36488: _low_level_execute_command(): starting 29922 1726853675.36490: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853675.2615073-31075-234430737595304/AnsiballZ_service_facts.py && sleep 0' 29922 1726853675.37076: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853675.37079: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853675.37081: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853675.37093: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853675.37157: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853675.37184: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853675.37222: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853675.37305: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853676.97279: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source":<<< 29922 1726853676.97341: stdout chunk (state=3): >>> "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": 
"systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", 
"state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-<<< 29922 1726853676.97370: stdout chunk (state=3): >>>boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": 
"systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 29922 1726853676.98949: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 29922 1726853676.98953: stdout chunk (state=3): >>><<< 29922 1726853676.99176: stderr chunk (state=3): >>><<< 29922 1726853676.99181: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", 
"source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": 
"systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": 
{"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
29922 1726853676.99784: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853675.2615073-31075-234430737595304/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 29922 1726853676.99798: _low_level_execute_command(): starting 29922 1726853676.99803: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853675.2615073-31075-234430737595304/ > /dev/null 2>&1 && sleep 0' 29922 1726853677.00549: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853677.00578: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853677.00674: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853677.02678: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853677.02721: stderr chunk (state=3): >>><<< 29922 1726853677.02737: stdout chunk (state=3): >>><<< 29922 1726853677.02877: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853677.02880: handler run complete 29922 1726853677.02964: variable 'ansible_facts' from source: unknown 29922 1726853677.03118: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853677.03611: variable 'ansible_facts' from source: unknown 29922 1726853677.03745: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853677.03950: attempt loop complete, returning result 29922 1726853677.03961: _execute() done 29922 1726853677.03968: dumping result to json 29922 1726853677.04035: done dumping result, returning 29922 1726853677.04048: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [02083763-bbaf-51d4-513b-0000000004f7] 29922 1726853677.04056: sending task result for task 02083763-bbaf-51d4-513b-0000000004f7 ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 29922 1726853677.05050: no more pending results, returning what we have 29922 1726853677.05053: results queue empty 29922 1726853677.05054: checking for any_errors_fatal 29922 1726853677.05059: done checking for any_errors_fatal 29922 1726853677.05060: checking for max_fail_percentage 29922 1726853677.05061: done checking for max_fail_percentage 29922 1726853677.05062: checking to see if all hosts have failed and the running result is not ok 29922 1726853677.05063: done checking to see if all hosts have failed 29922 1726853677.05063: getting the remaining hosts for this loop 29922 1726853677.05065: done getting the remaining hosts for this loop 29922 1726853677.05068: getting the next task for host managed_node3 29922 1726853677.05075: done getting next task for host managed_node3 29922 1726853677.05079: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 29922 1726853677.05082: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853677.05093: getting variables 29922 1726853677.05094: in VariableManager get_vars() 29922 1726853677.05129: Calling all_inventory to load vars for managed_node3 29922 1726853677.05132: Calling groups_inventory to load vars for managed_node3 29922 1726853677.05135: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853677.05145: Calling all_plugins_play to load vars for managed_node3 29922 1726853677.05147: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853677.05150: Calling groups_plugins_play to load vars for managed_node3 29922 1726853677.05685: done sending task result for task 02083763-bbaf-51d4-513b-0000000004f7 29922 1726853677.05688: WORKER PROCESS EXITING 29922 1726853677.06815: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853677.08252: done with get_vars() 29922 1726853677.08282: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 13:34:37 -0400 (0:00:01.873) 0:00:26.013 ****** 29922 1726853677.08378: entering _queue_task() for managed_node3/package_facts 29922 1726853677.08732: worker is 1 (out of 1 available) 29922 1726853677.08745: exiting _queue_task() for managed_node3/package_facts 29922 1726853677.08758: done queuing things up, now waiting for results queue to drain 29922 1726853677.08759: waiting for pending results... 29922 1726853677.09048: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 29922 1726853677.09202: in run() - task 02083763-bbaf-51d4-513b-0000000004f8 29922 1726853677.09222: variable 'ansible_search_path' from source: unknown 29922 1726853677.09230: variable 'ansible_search_path' from source: unknown 29922 1726853677.09269: calling self._execute() 29922 1726853677.09382: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853677.09397: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853677.09420: variable 'omit' from source: magic vars 29922 1726853677.09812: variable 'ansible_distribution_major_version' from source: facts 29922 1726853677.09831: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853677.09847: variable 'omit' from source: magic vars 29922 1726853677.09919: variable 'omit' from source: magic vars 29922 1726853677.09964: variable 'omit' from source: magic vars 29922 1726853677.10015: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29922 1726853677.10059: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29922 1726853677.10178: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29922 1726853677.10182: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853677.10185: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853677.10188: variable 'inventory_hostname' from source: host vars for 'managed_node3' 29922 1726853677.10190: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853677.10192: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853677.10275: Set connection var ansible_connection to ssh 29922 1726853677.10292: Set connection var ansible_module_compression to ZIP_DEFLATED 29922 1726853677.10306: Set connection var ansible_shell_executable to /bin/sh 29922 1726853677.10317: Set connection var ansible_pipelining to False 29922 1726853677.10326: Set connection var ansible_timeout to 10 29922 1726853677.10333: Set connection var ansible_shell_type to sh 29922 1726853677.10360: variable 'ansible_shell_executable' from source: unknown 29922 1726853677.10367: variable 'ansible_connection' from source: unknown 29922 1726853677.10377: variable 'ansible_module_compression' from source: unknown 29922 1726853677.10384: variable 'ansible_shell_type' from source: unknown 29922 1726853677.10394: variable 'ansible_shell_executable' from source: unknown 29922 1726853677.10401: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853677.10408: variable 'ansible_pipelining' from source: unknown 29922 1726853677.10415: variable 'ansible_timeout' from source: unknown 29922 1726853677.10422: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853677.10624: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 29922 1726853677.10644: variable 'omit' from source: magic vars 29922 1726853677.10720: starting attempt loop 29922 1726853677.10723: running the handler 29922 1726853677.10726: _low_level_execute_command(): starting 29922 1726853677.10728: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 29922 1726853677.11385: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853677.11400: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853677.11459: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853677.11538: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853677.11575: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853677.11670: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853677.13624: stdout chunk (state=3): >>>/root <<< 29922 1726853677.13701: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853677.13705: stdout chunk (state=3): >>><<< 29922 1726853677.13707: stderr chunk 
(state=3): >>><<< 29922 1726853677.13733: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853677.13753: _low_level_execute_command(): starting 29922 1726853677.13764: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853677.1374037-31144-191652803177910 `" && echo ansible-tmp-1726853677.1374037-31144-191652803177910="` echo /root/.ansible/tmp/ansible-tmp-1726853677.1374037-31144-191652803177910 `" ) && sleep 0' 29922 1726853677.14415: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853677.14433: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853677.14477: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853677.14588: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853677.14592: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853677.14602: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853677.14618: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853677.14714: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853677.16661: stdout chunk (state=3): >>>ansible-tmp-1726853677.1374037-31144-191652803177910=/root/.ansible/tmp/ansible-tmp-1726853677.1374037-31144-191652803177910 <<< 29922 1726853677.16806: stderr chunk (state=3): >>>debug2: Received exit status from 
master 0 <<< 29922 1726853677.16820: stdout chunk (state=3): >>><<< 29922 1726853677.16833: stderr chunk (state=3): >>><<< 29922 1726853677.16865: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853677.1374037-31144-191652803177910=/root/.ansible/tmp/ansible-tmp-1726853677.1374037-31144-191652803177910 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853677.17031: variable 'ansible_module_compression' from source: unknown 29922 1726853677.17035: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-29922avbm1zsd/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 29922 1726853677.17081: variable 'ansible_facts' from source: unknown 29922 1726853677.17295: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853677.1374037-31144-191652803177910/AnsiballZ_package_facts.py 29922 1726853677.17506: Sending initial data 29922 1726853677.17509: Sent initial data (162 bytes) 29922 1726853677.18210: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853677.18279: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853677.18315: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853677.19919: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 29922 1726853677.19942: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" 
revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 29922 1726853677.20016: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 29922 1726853677.20101: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-29922avbm1zsd/tmpxyvv33bo /root/.ansible/tmp/ansible-tmp-1726853677.1374037-31144-191652803177910/AnsiballZ_package_facts.py <<< 29922 1726853677.20112: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853677.1374037-31144-191652803177910/AnsiballZ_package_facts.py" <<< 29922 1726853677.20162: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-29922avbm1zsd/tmpxyvv33bo" to remote "/root/.ansible/tmp/ansible-tmp-1726853677.1374037-31144-191652803177910/AnsiballZ_package_facts.py" <<< 29922 1726853677.20187: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853677.1374037-31144-191652803177910/AnsiballZ_package_facts.py" <<< 29922 1726853677.21978: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853677.21981: stdout chunk (state=3): >>><<< 29922 1726853677.21984: stderr chunk (state=3): >>><<< 29922 1726853677.21986: done transferring module to remote 29922 1726853677.21988: _low_level_execute_command(): starting 29922 1726853677.21990: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853677.1374037-31144-191652803177910/ /root/.ansible/tmp/ansible-tmp-1726853677.1374037-31144-191652803177910/AnsiballZ_package_facts.py && sleep 0' 29922 1726853677.22569: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853677.22584: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853677.22598: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853677.22664: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853677.22718: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 
1726853677.22747: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853677.22769: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853677.22851: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853677.25036: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853677.25040: stdout chunk (state=3): >>><<< 29922 1726853677.25042: stderr chunk (state=3): >>><<< 29922 1726853677.25044: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853677.25047: _low_level_execute_command(): starting 29922 1726853677.25049: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853677.1374037-31144-191652803177910/AnsiballZ_package_facts.py && sleep 0' 29922 1726853677.25782: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853677.25797: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853677.25811: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853677.25826: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853677.25907: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853677.25976: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853677.25993: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853677.26013: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 
1726853677.26219: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853677.70878: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", 
"version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "rele<<< 29922 1726853677.71210: stdout chunk (state=3): >>>ase": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, 
"arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", 
"release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": 
[{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": 
"1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": 
"crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", 
"release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", 
"release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], 
"perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 29922 1726853677.72932: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 29922 1726853677.72935: stdout chunk (state=3): >>><<< 29922 1726853677.72938: stderr chunk (state=3): >>><<< 29922 1726853677.73180: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": 
[{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", 
"release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", 
"version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": 
"1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": 
"1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": 
[{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": 
"510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], 
"perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 29922 1726853677.75215: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853677.1374037-31144-191652803177910/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 29922 1726853677.75244: _low_level_execute_command(): starting 29922 1726853677.75253: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853677.1374037-31144-191652803177910/ > /dev/null 2>&1 && sleep 0' 29922 1726853677.75931: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853677.75935: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853677.75937: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853677.75939: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<< 29922 1726853677.75941: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853677.75943: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853677.76007: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853677.76102: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853677.78037: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853677.78052: stdout chunk (state=3): >>><<< 29922 1726853677.78064: stderr chunk (state=3): >>><<< 29922 1726853677.78085: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853677.78095: handler run complete 29922 1726853677.78939: variable 'ansible_facts' from source: unknown 29922 1726853677.79421: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853677.81383: variable 'ansible_facts' from source: unknown 29922 1726853677.81912: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853677.82656: attempt loop complete, returning result 29922 1726853677.82876: _execute() done 29922 1726853677.82879: dumping result to json 29922 1726853677.82899: done dumping result, returning 29922 1726853677.82912: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [02083763-bbaf-51d4-513b-0000000004f8] 29922 1726853677.82920: sending task result for task 02083763-bbaf-51d4-513b-0000000004f8 29922 1726853677.85329: done sending task result for task 02083763-bbaf-51d4-513b-0000000004f8 29922 1726853677.85332: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 29922 1726853677.85491: no more pending results, returning what we have 29922 1726853677.85494: results queue empty 29922 1726853677.85495: checking for any_errors_fatal 29922 1726853677.85499: done checking for any_errors_fatal 29922 1726853677.85500: checking for max_fail_percentage 29922 1726853677.85501: done checking for max_fail_percentage 29922 1726853677.85502: checking to see if all hosts have failed and the running result is not ok 29922 1726853677.85503: done checking to see if all hosts have failed 29922 1726853677.85504: getting the remaining hosts for this loop 29922 1726853677.85505: done getting the remaining hosts for this loop 29922 1726853677.85508: getting the next task for host managed_node3 29922 1726853677.85514: done getting next task for host managed_node3 29922 1726853677.85517: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 29922 1726853677.85519: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853677.85529: getting variables 29922 1726853677.85530: in VariableManager get_vars() 29922 1726853677.85559: Calling all_inventory to load vars for managed_node3 29922 1726853677.85562: Calling groups_inventory to load vars for managed_node3 29922 1726853677.85565: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853677.85777: Calling all_plugins_play to load vars for managed_node3 29922 1726853677.85781: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853677.85785: Calling groups_plugins_play to load vars for managed_node3 29922 1726853677.88008: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853677.90210: done with get_vars() 29922 1726853677.90240: done getting variables 29922 1726853677.90438: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 13:34:37 -0400 (0:00:00.820) 0:00:26.834 ****** 29922 1726853677.90516: entering _queue_task() for managed_node3/debug 29922 1726853677.91323: worker is 1 (out of 1 available) 29922 1726853677.91336: exiting _queue_task() for managed_node3/debug 29922 1726853677.91348: done queuing things up, now waiting for results queue to drain 29922 1726853677.91350: waiting for pending results... 29922 1726853677.91702: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 29922 1726853677.91736: in run() - task 02083763-bbaf-51d4-513b-000000000072 29922 1726853677.91763: variable 'ansible_search_path' from source: unknown 29922 1726853677.91779: variable 'ansible_search_path' from source: unknown 29922 1726853677.91807: calling self._execute() 29922 1726853677.91898: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853677.91903: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853677.91912: variable 'omit' from source: magic vars 29922 1726853677.92203: variable 'ansible_distribution_major_version' from source: facts 29922 1726853677.92215: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853677.92221: variable 'omit' from source: magic vars 29922 1726853677.92253: variable 'omit' from source: magic vars 29922 1726853677.92343: variable 'network_provider' from source: set_fact 29922 1726853677.92348: variable 'omit' from source: magic vars 29922 1726853677.92373: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29922 1726853677.92400: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29922 1726853677.92418: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29922 1726853677.92432: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853677.92443: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 
1726853677.92478: variable 'inventory_hostname' from source: host vars for 'managed_node3' 29922 1726853677.92482: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853677.92484: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853677.92545: Set connection var ansible_connection to ssh 29922 1726853677.92552: Set connection var ansible_module_compression to ZIP_DEFLATED 29922 1726853677.92563: Set connection var ansible_shell_executable to /bin/sh 29922 1726853677.92570: Set connection var ansible_pipelining to False 29922 1726853677.92579: Set connection var ansible_timeout to 10 29922 1726853677.92582: Set connection var ansible_shell_type to sh 29922 1726853677.92599: variable 'ansible_shell_executable' from source: unknown 29922 1726853677.92602: variable 'ansible_connection' from source: unknown 29922 1726853677.92604: variable 'ansible_module_compression' from source: unknown 29922 1726853677.92607: variable 'ansible_shell_type' from source: unknown 29922 1726853677.92609: variable 'ansible_shell_executable' from source: unknown 29922 1726853677.92611: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853677.92614: variable 'ansible_pipelining' from source: unknown 29922 1726853677.92616: variable 'ansible_timeout' from source: unknown 29922 1726853677.92620: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853677.92723: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 29922 1726853677.92731: variable 'omit' from source: magic vars 29922 1726853677.92736: starting attempt loop 29922 1726853677.92739: running the handler 29922 1726853677.92775: handler run complete 29922 1726853677.92821: attempt loop complete, returning result 29922 1726853677.92824: _execute() done 29922 1726853677.92827: dumping result to json 29922 1726853677.92829: done dumping result, returning 29922 1726853677.92831: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [02083763-bbaf-51d4-513b-000000000072] 29922 1726853677.92834: sending task result for task 02083763-bbaf-51d4-513b-000000000072 29922 1726853677.93009: done sending task result for task 02083763-bbaf-51d4-513b-000000000072 29922 1726853677.93012: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: Using network provider: nm 29922 1726853677.93196: no more pending results, returning what we have 29922 1726853677.93199: results queue empty 29922 1726853677.93199: checking for any_errors_fatal 29922 1726853677.93205: done checking for any_errors_fatal 29922 1726853677.93206: checking for max_fail_percentage 29922 1726853677.93207: done checking for max_fail_percentage 29922 1726853677.93208: checking to see if all hosts have failed and the running result is not ok 29922 1726853677.93209: done checking to see if all hosts have failed 29922 1726853677.93210: getting the remaining hosts for this loop 29922 1726853677.93211: done getting the remaining hosts for this loop 29922 1726853677.93214: getting the next task for host managed_node3 29922 1726853677.93218: done getting next task for host managed_node3 29922 1726853677.93221: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state 
configuration if using the `network_state` variable with the initscripts provider 29922 1726853677.93223: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29922 1726853677.93233: getting variables 29922 1726853677.93234: in VariableManager get_vars() 29922 1726853677.93276: Calling all_inventory to load vars for managed_node3 29922 1726853677.93279: Calling groups_inventory to load vars for managed_node3 29922 1726853677.93281: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853677.93289: Calling all_plugins_play to load vars for managed_node3 29922 1726853677.93292: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853677.93295: Calling groups_plugins_play to load vars for managed_node3 29922 1726853677.94563: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853677.97031: done with get_vars() 29922 1726853677.97061: done getting variables 29922 1726853677.97121: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 13:34:37 -0400 (0:00:00.066) 0:00:26.901 ****** 29922 1726853677.97158: entering _queue_task() for managed_node3/fail 29922 1726853677.97522: worker is 1 (out of 1 available) 29922 1726853677.97536: exiting _queue_task() for managed_node3/fail 29922 1726853677.97547: done queuing things up, now waiting for results queue to drain 29922 1726853677.97548: waiting for pending results... 
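[Note] The "Print network provider" step traced above (task path roles/network/tasks/main.yml:7) is an ordinary debug task that echoes the network_provider value the role selected earlier, here "nm". The standalone playbook below is a rough approximation of that step built only from the logged task name and message; it is not the role's verbatim source, and the hard-coded network_provider is a stand-in for the set_fact that happens earlier in the run.

    # Hypothetical reconstruction of the "Print network provider" task seen in
    # the trace; runnable on its own with ansible-playbook.
    - hosts: localhost
      gather_facts: false
      vars:
        network_provider: nm    # assumption: normally set by the role's earlier set_fact
      tasks:
        - name: Print network provider
          ansible.builtin.debug:
            msg: "Using network provider: {{ network_provider }}"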
29922 1726853677.97910: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 29922 1726853677.97916: in run() - task 02083763-bbaf-51d4-513b-000000000073 29922 1726853677.97920: variable 'ansible_search_path' from source: unknown 29922 1726853677.97923: variable 'ansible_search_path' from source: unknown 29922 1726853677.97958: calling self._execute() 29922 1726853677.98163: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853677.98180: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853677.98194: variable 'omit' from source: magic vars 29922 1726853677.98560: variable 'ansible_distribution_major_version' from source: facts 29922 1726853677.98580: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853677.98698: variable 'network_state' from source: role '' defaults 29922 1726853677.98712: Evaluated conditional (network_state != {}): False 29922 1726853677.98719: when evaluation is False, skipping this task 29922 1726853677.98725: _execute() done 29922 1726853677.98731: dumping result to json 29922 1726853677.98736: done dumping result, returning 29922 1726853677.98746: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [02083763-bbaf-51d4-513b-000000000073] 29922 1726853677.98762: sending task result for task 02083763-bbaf-51d4-513b-000000000073 skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 29922 1726853677.98933: no more pending results, returning what we have 29922 1726853677.98936: results queue empty 29922 1726853677.98937: checking for any_errors_fatal 29922 1726853677.98946: done checking for any_errors_fatal 29922 1726853677.98947: checking for max_fail_percentage 29922 1726853677.98948: done checking for max_fail_percentage 29922 1726853677.98949: checking to see if all hosts have failed and the running result is not ok 29922 1726853677.98950: done checking to see if all hosts have failed 29922 1726853677.98950: getting the remaining hosts for this loop 29922 1726853677.98951: done getting the remaining hosts for this loop 29922 1726853677.98957: getting the next task for host managed_node3 29922 1726853677.98962: done getting next task for host managed_node3 29922 1726853677.98966: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 29922 1726853677.98968: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853677.98992: getting variables 29922 1726853677.98994: in VariableManager get_vars() 29922 1726853677.99029: Calling all_inventory to load vars for managed_node3 29922 1726853677.99033: Calling groups_inventory to load vars for managed_node3 29922 1726853677.99035: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853677.99047: Calling all_plugins_play to load vars for managed_node3 29922 1726853677.99050: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853677.99053: Calling groups_plugins_play to load vars for managed_node3 29922 1726853677.99580: done sending task result for task 02083763-bbaf-51d4-513b-000000000073 29922 1726853677.99584: WORKER PROCESS EXITING 29922 1726853678.00956: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853678.02610: done with get_vars() 29922 1726853678.02634: done getting variables 29922 1726853678.02690: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 13:34:38 -0400 (0:00:00.055) 0:00:26.956 ****** 29922 1726853678.02716: entering _queue_task() for managed_node3/fail 29922 1726853678.03122: worker is 1 (out of 1 available) 29922 1726853678.03135: exiting _queue_task() for managed_node3/fail 29922 1726853678.03148: done queuing things up, now waiting for results queue to drain 29922 1726853678.03149: waiting for pending results... 
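[Note] The skip just logged shows the guard pattern the role uses at main.yml:11: a fail task whose first condition, network_state != {}, evaluated False because network_state is still the role default, so the task never fires. The sketch below illustrates that pattern; the false_condition is taken from the result above, while the provider check and the failure message are assumptions inferred from the task name, not the role's verbatim source.

    # Sketch of a conditional "abort" guard; skipped here because network_state
    # keeps its empty default, matching the skip logged above.
    - hosts: localhost
      gather_facts: false
      vars:
        network_state: {}       # role default; makes the guard skip
        network_provider: nm
      tasks:
        - name: Abort when network_state is used with the initscripts provider
          ansible.builtin.fail:
            msg: network_state is not supported with the initscripts provider
          when:
            - network_state != {}                  # the false_condition logged above
            - network_provider == "initscripts"    # assumption, inferred from the task name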
29922 1726853678.03514: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 29922 1726853678.03678: in run() - task 02083763-bbaf-51d4-513b-000000000074 29922 1726853678.03688: variable 'ansible_search_path' from source: unknown 29922 1726853678.03691: variable 'ansible_search_path' from source: unknown 29922 1726853678.03731: calling self._execute() 29922 1726853678.03876: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853678.03880: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853678.03883: variable 'omit' from source: magic vars 29922 1726853678.04275: variable 'ansible_distribution_major_version' from source: facts 29922 1726853678.04295: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853678.04449: variable 'network_state' from source: role '' defaults 29922 1726853678.04453: Evaluated conditional (network_state != {}): False 29922 1726853678.04455: when evaluation is False, skipping this task 29922 1726853678.04457: _execute() done 29922 1726853678.04460: dumping result to json 29922 1726853678.04462: done dumping result, returning 29922 1726853678.04467: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [02083763-bbaf-51d4-513b-000000000074] 29922 1726853678.04480: sending task result for task 02083763-bbaf-51d4-513b-000000000074 29922 1726853678.04779: done sending task result for task 02083763-bbaf-51d4-513b-000000000074 29922 1726853678.04783: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 29922 1726853678.04829: no more pending results, returning what we have 29922 1726853678.04833: results queue empty 29922 1726853678.04834: checking for any_errors_fatal 29922 1726853678.04841: done checking for any_errors_fatal 29922 1726853678.04842: checking for max_fail_percentage 29922 1726853678.04844: done checking for max_fail_percentage 29922 1726853678.04845: checking to see if all hosts have failed and the running result is not ok 29922 1726853678.04846: done checking to see if all hosts have failed 29922 1726853678.04846: getting the remaining hosts for this loop 29922 1726853678.04848: done getting the remaining hosts for this loop 29922 1726853678.04851: getting the next task for host managed_node3 29922 1726853678.04856: done getting next task for host managed_node3 29922 1726853678.04860: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 29922 1726853678.04863: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853678.04986: getting variables 29922 1726853678.04988: in VariableManager get_vars() 29922 1726853678.05023: Calling all_inventory to load vars for managed_node3 29922 1726853678.05026: Calling groups_inventory to load vars for managed_node3 29922 1726853678.05028: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853678.05039: Calling all_plugins_play to load vars for managed_node3 29922 1726853678.05042: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853678.05045: Calling groups_plugins_play to load vars for managed_node3 29922 1726853678.06446: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853678.08025: done with get_vars() 29922 1726853678.08048: done getting variables 29922 1726853678.08106: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 13:34:38 -0400 (0:00:00.054) 0:00:27.010 ****** 29922 1726853678.08134: entering _queue_task() for managed_node3/fail 29922 1726853678.08439: worker is 1 (out of 1 available) 29922 1726853678.08451: exiting _queue_task() for managed_node3/fail 29922 1726853678.08462: done queuing things up, now waiting for results queue to drain 29922 1726853678.08463: waiting for pending results... 
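[Note] The task queued above and evaluated below, "Abort applying teaming configuration if the system version of the managed host is EL10 or later" (main.yml:25), is gated on the distribution facts; the trace shows it evaluating ansible_distribution_major_version | int > 9 and ansible_distribution in __network_rh_distros. The sketch below mirrors those two logged conditionals; the teaming check itself, the network_connections structure, and the stand-in fact values are assumptions for illustration, not the role's verbatim source.

    # Sketch of an EL10+ teaming guard; the first two conditions mirror the
    # conditionals evaluated in the trace, the third is an assumed team check.
    - hosts: localhost
      gather_facts: false
      vars:
        ansible_distribution: CentOS                    # stand-ins for gathered facts
        ansible_distribution_major_version: "10"
        __network_rh_distros: [RedHat, CentOS, Fedora]
        network_connections: []                         # no team profiles, so the guard skips
      tasks:
        - name: Abort applying teaming configuration on EL10 or later
          ansible.builtin.fail:
            msg: Teaming is not supported on EL10 or later
          when:
            - ansible_distribution_major_version | int > 9
            - ansible_distribution in __network_rh_distros
            - >-
              network_connections | selectattr('type', 'defined')
              | selectattr('type', 'eq', 'team') | list | length > 0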
29922 1726853678.08789: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 29922 1726853678.08847: in run() - task 02083763-bbaf-51d4-513b-000000000075 29922 1726853678.08867: variable 'ansible_search_path' from source: unknown 29922 1726853678.08877: variable 'ansible_search_path' from source: unknown 29922 1726853678.09076: calling self._execute() 29922 1726853678.09080: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853678.09083: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853678.09085: variable 'omit' from source: magic vars 29922 1726853678.09426: variable 'ansible_distribution_major_version' from source: facts 29922 1726853678.09443: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853678.09618: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 29922 1726853678.12118: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 29922 1726853678.12191: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 29922 1726853678.12228: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 29922 1726853678.12267: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 29922 1726853678.12297: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 29922 1726853678.12384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853678.12415: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29922 1726853678.12454: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853678.12495: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853678.12564: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853678.12615: variable 'ansible_distribution_major_version' from source: facts 29922 1726853678.12637: Evaluated conditional (ansible_distribution_major_version | int > 9): True 29922 1726853678.12757: variable 'ansible_distribution' from source: facts 29922 1726853678.12767: variable '__network_rh_distros' from source: role '' defaults 29922 1726853678.12785: Evaluated conditional (ansible_distribution in __network_rh_distros): True 29922 1726853678.13056: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853678.13089: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29922 1726853678.13123: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853678.13212: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853678.13216: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853678.13240: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853678.13267: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29922 1726853678.13297: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853678.13345: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853678.13363: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853678.13409: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853678.13676: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29922 1726853678.13679: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853678.13681: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853678.13683: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853678.13834: variable 'network_connections' from source: play vars 29922 1726853678.13850: variable 'profile' from source: play vars 29922 1726853678.13924: variable 'profile' from source: play vars 29922 1726853678.13933: variable 'interface' from source: set_fact 29922 1726853678.13995: variable 'interface' from source: set_fact 29922 1726853678.14012: variable 'network_state' from source: role '' defaults 29922 
1726853678.14126: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 29922 1726853678.14249: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 29922 1726853678.14291: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 29922 1726853678.14322: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 29922 1726853678.14357: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 29922 1726853678.14403: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 29922 1726853678.14436: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 29922 1726853678.14561: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853678.14564: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 29922 1726853678.14567: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 29922 1726853678.14569: when evaluation is False, skipping this task 29922 1726853678.14573: _execute() done 29922 1726853678.14575: dumping result to json 29922 1726853678.14577: done dumping result, returning 29922 1726853678.14579: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [02083763-bbaf-51d4-513b-000000000075] 29922 1726853678.14582: sending task result for task 02083763-bbaf-51d4-513b-000000000075 skipping: [managed_node3] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 29922 1726853678.14711: no more pending results, returning what we have 29922 1726853678.14714: results queue empty 29922 1726853678.14715: checking for any_errors_fatal 29922 1726853678.14721: done checking for any_errors_fatal 29922 1726853678.14722: checking for max_fail_percentage 29922 1726853678.14724: done checking for max_fail_percentage 29922 1726853678.14725: checking to see if all hosts have failed and the running result is not ok 29922 1726853678.14726: done checking to see if all hosts have failed 29922 1726853678.14726: getting the remaining hosts for this loop 29922 1726853678.14728: done getting the remaining hosts for this loop 29922 1726853678.14731: getting the next task for host managed_node3 29922 1726853678.14737: done getting next task for host managed_node3 29922 1726853678.14741: ^ task is: TASK: 
fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 29922 1726853678.14743: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29922 1726853678.14758: getting variables 29922 1726853678.14759: in VariableManager get_vars() 29922 1726853678.14799: Calling all_inventory to load vars for managed_node3 29922 1726853678.14801: Calling groups_inventory to load vars for managed_node3 29922 1726853678.14804: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853678.14814: Calling all_plugins_play to load vars for managed_node3 29922 1726853678.14816: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853678.14819: Calling groups_plugins_play to load vars for managed_node3 29922 1726853678.15712: done sending task result for task 02083763-bbaf-51d4-513b-000000000075 29922 1726853678.15715: WORKER PROCESS EXITING 29922 1726853678.16615: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853678.18203: done with get_vars() 29922 1726853678.18225: done getting variables 29922 1726853678.18284: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 13:34:38 -0400 (0:00:00.101) 0:00:27.112 ****** 29922 1726853678.18312: entering _queue_task() for managed_node3/dnf 29922 1726853678.18634: worker is 1 (out of 1 available) 29922 1726853678.18646: exiting _queue_task() for managed_node3/dnf 29922 1726853678.18656: done queuing things up, now waiting for results queue to drain 29922 1726853678.18658: waiting for pending results... 
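The DNF check task announced above (main.yml:36) is only confirmed by this log to use the dnf action module and the two conditions shown; the module arguments and register name in the sketch are assumptions added to make it concrete.

  - name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
    dnf:
      name: "{{ network_packages }}"   # assumed arguments; the log confirms only that the dnf action is used
      state: latest
    register: __network_updates        # hypothetical name for the result
    check_mode: true                   # assumed, since the task only checks availability
    when:
      - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7
      - __network_wireless_connections_defined or __network_team_connections_defined

Neither wireless nor team connections are defined for this play, so the second condition is False and the task is skipped.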
29922 1726853678.18934: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 29922 1726853678.19048: in run() - task 02083763-bbaf-51d4-513b-000000000076 29922 1726853678.19067: variable 'ansible_search_path' from source: unknown 29922 1726853678.19077: variable 'ansible_search_path' from source: unknown 29922 1726853678.19122: calling self._execute() 29922 1726853678.19233: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853678.19246: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853678.19259: variable 'omit' from source: magic vars 29922 1726853678.19636: variable 'ansible_distribution_major_version' from source: facts 29922 1726853678.19654: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853678.19853: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 29922 1726853678.22131: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 29922 1726853678.22237: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 29922 1726853678.22247: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 29922 1726853678.22287: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 29922 1726853678.22317: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 29922 1726853678.22426: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853678.22480: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29922 1726853678.22564: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853678.22568: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853678.22570: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853678.22734: variable 'ansible_distribution' from source: facts 29922 1726853678.22978: variable 'ansible_distribution_major_version' from source: facts 29922 1726853678.22981: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 29922 1726853678.23024: variable '__network_wireless_connections_defined' from source: role '' defaults 29922 1726853678.23315: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853678.23477: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29922 1726853678.23480: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853678.23511: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853678.23777: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853678.23780: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853678.23783: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29922 1726853678.23785: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853678.23834: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853678.23852: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853678.24019: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853678.24046: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29922 1726853678.24075: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853678.24141: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853678.24163: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853678.24347: variable 'network_connections' from source: play vars 29922 1726853678.24366: variable 'profile' from source: play vars 29922 1726853678.24443: variable 'profile' from source: play vars 29922 1726853678.24452: variable 'interface' from source: set_fact 29922 1726853678.24516: variable 'interface' from source: set_fact 29922 1726853678.24595: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' 
skipped due to reserved name 29922 1726853678.24774: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 29922 1726853678.24816: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 29922 1726853678.24851: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 29922 1726853678.24889: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 29922 1726853678.24935: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 29922 1726853678.24965: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 29922 1726853678.25180: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853678.25183: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 29922 1726853678.25185: variable '__network_team_connections_defined' from source: role '' defaults 29922 1726853678.25336: variable 'network_connections' from source: play vars 29922 1726853678.25346: variable 'profile' from source: play vars 29922 1726853678.25414: variable 'profile' from source: play vars 29922 1726853678.25422: variable 'interface' from source: set_fact 29922 1726853678.25485: variable 'interface' from source: set_fact 29922 1726853678.25514: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 29922 1726853678.25526: when evaluation is False, skipping this task 29922 1726853678.25532: _execute() done 29922 1726853678.25540: dumping result to json 29922 1726853678.25547: done dumping result, returning 29922 1726853678.25558: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [02083763-bbaf-51d4-513b-000000000076] 29922 1726853678.25567: sending task result for task 02083763-bbaf-51d4-513b-000000000076 skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 29922 1726853678.25870: no more pending results, returning what we have 29922 1726853678.25875: results queue empty 29922 1726853678.25876: checking for any_errors_fatal 29922 1726853678.25883: done checking for any_errors_fatal 29922 1726853678.25884: checking for max_fail_percentage 29922 1726853678.25886: done checking for max_fail_percentage 29922 1726853678.25887: checking to see if all hosts have failed and the running result is not ok 29922 1726853678.25888: done checking to see if all hosts have failed 29922 1726853678.25889: getting the remaining hosts for this loop 29922 1726853678.25890: done getting the remaining hosts for this loop 29922 1726853678.25894: getting the next task for host managed_node3 29922 1726853678.25901: done getting next task for host managed_node3 29922 
1726853678.25905: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 29922 1726853678.25907: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29922 1726853678.25921: getting variables 29922 1726853678.25923: in VariableManager get_vars() 29922 1726853678.25963: Calling all_inventory to load vars for managed_node3 29922 1726853678.25966: Calling groups_inventory to load vars for managed_node3 29922 1726853678.25968: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853678.25987: Calling all_plugins_play to load vars for managed_node3 29922 1726853678.25997: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853678.26057: Calling groups_plugins_play to load vars for managed_node3 29922 1726853678.26585: done sending task result for task 02083763-bbaf-51d4-513b-000000000076 29922 1726853678.26589: WORKER PROCESS EXITING 29922 1726853678.27713: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853678.29442: done with get_vars() 29922 1726853678.29465: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 29922 1726853678.29538: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 13:34:38 -0400 (0:00:00.112) 0:00:27.225 ****** 29922 1726853678.29566: entering _queue_task() for managed_node3/yum 29922 1726853678.29909: worker is 1 (out of 1 available) 29922 1726853678.29920: exiting _queue_task() for managed_node3/yum 29922 1726853678.29932: done queuing things up, now waiting for results queue to drain 29922 1726853678.29933: waiting for pending results... 
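The YUM variant announced above (main.yml:48) is shown being redirected to the dnf action; the log confirms only that it is gated on the major version being below 8. The module arguments, and the second condition (added by analogy with the DNF task), are assumptions and are never evaluated on this run because the version check already fails.

  - name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
    yum:
      name: "{{ network_packages }}"   # assumed arguments
      state: latest
    when:
      - ansible_distribution_major_version | int < 8
      - __network_wireless_connections_defined or __network_team_connections_defined   # assumed by analogy; not evaluated here

The major version evaluated earlier is greater than 9, so the first condition is False and the task is skipped.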
29922 1726853678.30213: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 29922 1726853678.30320: in run() - task 02083763-bbaf-51d4-513b-000000000077 29922 1726853678.30338: variable 'ansible_search_path' from source: unknown 29922 1726853678.30346: variable 'ansible_search_path' from source: unknown 29922 1726853678.30390: calling self._execute() 29922 1726853678.30501: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853678.30576: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853678.30580: variable 'omit' from source: magic vars 29922 1726853678.30923: variable 'ansible_distribution_major_version' from source: facts 29922 1726853678.30938: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853678.31116: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 29922 1726853678.33293: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 29922 1726853678.33363: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 29922 1726853678.33577: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 29922 1726853678.33581: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 29922 1726853678.33583: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 29922 1726853678.33585: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853678.33608: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29922 1726853678.33641: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853678.33689: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853678.33713: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853678.33810: variable 'ansible_distribution_major_version' from source: facts 29922 1726853678.33828: Evaluated conditional (ansible_distribution_major_version | int < 8): False 29922 1726853678.33836: when evaluation is False, skipping this task 29922 1726853678.33842: _execute() done 29922 1726853678.33847: dumping result to json 29922 1726853678.33853: done dumping result, returning 29922 1726853678.33862: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [02083763-bbaf-51d4-513b-000000000077] 29922 
1726853678.33869: sending task result for task 02083763-bbaf-51d4-513b-000000000077 29922 1726853678.34085: done sending task result for task 02083763-bbaf-51d4-513b-000000000077 29922 1726853678.34088: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 29922 1726853678.34137: no more pending results, returning what we have 29922 1726853678.34141: results queue empty 29922 1726853678.34141: checking for any_errors_fatal 29922 1726853678.34149: done checking for any_errors_fatal 29922 1726853678.34149: checking for max_fail_percentage 29922 1726853678.34151: done checking for max_fail_percentage 29922 1726853678.34152: checking to see if all hosts have failed and the running result is not ok 29922 1726853678.34153: done checking to see if all hosts have failed 29922 1726853678.34153: getting the remaining hosts for this loop 29922 1726853678.34155: done getting the remaining hosts for this loop 29922 1726853678.34158: getting the next task for host managed_node3 29922 1726853678.34165: done getting next task for host managed_node3 29922 1726853678.34168: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 29922 1726853678.34170: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29922 1726853678.34186: getting variables 29922 1726853678.34187: in VariableManager get_vars() 29922 1726853678.34228: Calling all_inventory to load vars for managed_node3 29922 1726853678.34232: Calling groups_inventory to load vars for managed_node3 29922 1726853678.34235: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853678.34248: Calling all_plugins_play to load vars for managed_node3 29922 1726853678.34251: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853678.34254: Calling groups_plugins_play to load vars for managed_node3 29922 1726853678.35848: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853678.37518: done with get_vars() 29922 1726853678.37550: done getting variables 29922 1726853678.37615: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 13:34:38 -0400 (0:00:00.080) 0:00:27.306 ****** 29922 1726853678.37661: entering _queue_task() for managed_node3/fail 29922 1726853678.38191: worker is 1 (out of 1 available) 29922 1726853678.38202: exiting _queue_task() for managed_node3/fail 29922 1726853678.38215: done queuing things up, now waiting for results queue to drain 29922 1726853678.38216: waiting for pending results... 
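The consent task announced above (main.yml:60) is confirmed by the log to use the fail action and to be gated on the same wireless-or-team condition as the DNF check; the message wording below is an assumption.

  - name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
    fail:
      msg: NetworkManager must be restarted to apply wireless or team configuration  # assumed wording
    when:
      - __network_wireless_connections_defined or __network_team_connections_defined

Again, neither kind of connection is defined here, so the task is skipped.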
29922 1726853678.38482: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 29922 1726853678.38488: in run() - task 02083763-bbaf-51d4-513b-000000000078 29922 1726853678.38491: variable 'ansible_search_path' from source: unknown 29922 1726853678.38494: variable 'ansible_search_path' from source: unknown 29922 1726853678.38530: calling self._execute() 29922 1726853678.38637: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853678.38651: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853678.38667: variable 'omit' from source: magic vars 29922 1726853678.39074: variable 'ansible_distribution_major_version' from source: facts 29922 1726853678.39092: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853678.39214: variable '__network_wireless_connections_defined' from source: role '' defaults 29922 1726853678.39424: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 29922 1726853678.42435: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 29922 1726853678.42589: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 29922 1726853678.42677: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 29922 1726853678.42701: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 29922 1726853678.42737: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 29922 1726853678.42851: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853678.43217: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29922 1726853678.43251: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853678.43303: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853678.43324: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853678.43382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853678.43415: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29922 1726853678.43447: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853678.43674: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853678.43679: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853678.43682: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853678.43685: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29922 1726853678.43687: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853678.43689: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853678.43691: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853678.43848: variable 'network_connections' from source: play vars 29922 1726853678.43868: variable 'profile' from source: play vars 29922 1726853678.43952: variable 'profile' from source: play vars 29922 1726853678.43963: variable 'interface' from source: set_fact 29922 1726853678.44037: variable 'interface' from source: set_fact 29922 1726853678.44116: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 29922 1726853678.44296: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 29922 1726853678.44363: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 29922 1726853678.44525: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 29922 1726853678.44528: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 29922 1726853678.44599: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 29922 1726853678.44627: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 29922 1726853678.44677: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853678.44695: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 29922 1726853678.44746: 
variable '__network_team_connections_defined' from source: role '' defaults 29922 1726853678.45328: variable 'network_connections' from source: play vars 29922 1726853678.45331: variable 'profile' from source: play vars 29922 1726853678.45477: variable 'profile' from source: play vars 29922 1726853678.45480: variable 'interface' from source: set_fact 29922 1726853678.45550: variable 'interface' from source: set_fact 29922 1726853678.45581: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 29922 1726853678.45661: when evaluation is False, skipping this task 29922 1726853678.45668: _execute() done 29922 1726853678.45676: dumping result to json 29922 1726853678.45683: done dumping result, returning 29922 1726853678.45695: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [02083763-bbaf-51d4-513b-000000000078] 29922 1726853678.45713: sending task result for task 02083763-bbaf-51d4-513b-000000000078 skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 29922 1726853678.46023: no more pending results, returning what we have 29922 1726853678.46027: results queue empty 29922 1726853678.46028: checking for any_errors_fatal 29922 1726853678.46036: done checking for any_errors_fatal 29922 1726853678.46037: checking for max_fail_percentage 29922 1726853678.46038: done checking for max_fail_percentage 29922 1726853678.46040: checking to see if all hosts have failed and the running result is not ok 29922 1726853678.46040: done checking to see if all hosts have failed 29922 1726853678.46041: getting the remaining hosts for this loop 29922 1726853678.46042: done getting the remaining hosts for this loop 29922 1726853678.46046: getting the next task for host managed_node3 29922 1726853678.46051: done getting next task for host managed_node3 29922 1726853678.46055: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 29922 1726853678.46058: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853678.46074: getting variables 29922 1726853678.46076: in VariableManager get_vars() 29922 1726853678.46115: Calling all_inventory to load vars for managed_node3 29922 1726853678.46118: Calling groups_inventory to load vars for managed_node3 29922 1726853678.46120: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853678.46130: Calling all_plugins_play to load vars for managed_node3 29922 1726853678.46132: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853678.46135: Calling groups_plugins_play to load vars for managed_node3 29922 1726853678.46807: done sending task result for task 02083763-bbaf-51d4-513b-000000000078 29922 1726853678.46810: WORKER PROCESS EXITING 29922 1726853678.48069: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853678.49752: done with get_vars() 29922 1726853678.49783: done getting variables 29922 1726853678.49847: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 13:34:38 -0400 (0:00:00.122) 0:00:27.428 ****** 29922 1726853678.49888: entering _queue_task() for managed_node3/package 29922 1726853678.50382: worker is 1 (out of 1 available) 29922 1726853678.50393: exiting _queue_task() for managed_node3/package 29922 1726853678.50405: done queuing things up, now waiting for results queue to drain 29922 1726853678.50406: waiting for pending results... 
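The Install packages task announced above (main.yml:73) is confirmed by the log to use the generic package action and to run only when some requested package is not already installed; state: present is an assumption.

  - name: Install packages
    package:
      name: "{{ network_packages }}"
      state: present   # assumed; the log confirms only the package action and the condition below
    when:
      - not network_packages is subset(ansible_facts.packages.keys())

Every package in network_packages is already present in ansible_facts.packages on this host, so the condition is False and the install is skipped.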
29922 1726853678.50668: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages 29922 1726853678.50823: in run() - task 02083763-bbaf-51d4-513b-000000000079 29922 1726853678.50868: variable 'ansible_search_path' from source: unknown 29922 1726853678.50886: variable 'ansible_search_path' from source: unknown 29922 1726853678.50928: calling self._execute() 29922 1726853678.51047: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853678.51064: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853678.51105: variable 'omit' from source: magic vars 29922 1726853678.51563: variable 'ansible_distribution_major_version' from source: facts 29922 1726853678.51651: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853678.51847: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 29922 1726853678.52253: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 29922 1726853678.52344: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 29922 1726853678.52388: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 29922 1726853678.52483: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 29922 1726853678.52650: variable 'network_packages' from source: role '' defaults 29922 1726853678.52840: variable '__network_provider_setup' from source: role '' defaults 29922 1726853678.52844: variable '__network_service_name_default_nm' from source: role '' defaults 29922 1726853678.52886: variable '__network_service_name_default_nm' from source: role '' defaults 29922 1726853678.52899: variable '__network_packages_default_nm' from source: role '' defaults 29922 1726853678.52968: variable '__network_packages_default_nm' from source: role '' defaults 29922 1726853678.53278: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 29922 1726853678.55288: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 29922 1726853678.55334: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 29922 1726853678.55369: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 29922 1726853678.55398: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 29922 1726853678.55418: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 29922 1726853678.55524: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853678.55527: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29922 1726853678.55553: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853678.55645: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853678.55650: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853678.55653: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853678.55676: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29922 1726853678.55690: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853678.55727: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853678.55741: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853678.55961: variable '__network_packages_default_gobject_packages' from source: role '' defaults 29922 1726853678.56079: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853678.56085: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29922 1726853678.56176: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853678.56179: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853678.56187: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853678.56239: variable 'ansible_python' from source: facts 29922 1726853678.56264: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 29922 1726853678.56347: variable '__network_wpa_supplicant_required' from source: role '' defaults 29922 1726853678.56436: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 29922 1726853678.56552: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853678.56578: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 29922 1726853678.56595: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853678.56641: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853678.56648: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853678.56684: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853678.56704: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29922 1726853678.56720: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853678.56749: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853678.56759: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853678.56861: variable 'network_connections' from source: play vars 29922 1726853678.56865: variable 'profile' from source: play vars 29922 1726853678.56932: variable 'profile' from source: play vars 29922 1726853678.56937: variable 'interface' from source: set_fact 29922 1726853678.57070: variable 'interface' from source: set_fact 29922 1726853678.57076: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 29922 1726853678.57177: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 29922 1726853678.57180: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853678.57183: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 29922 1726853678.57282: variable '__network_wireless_connections_defined' from source: role '' defaults 29922 1726853678.57547: variable 'network_connections' from source: play vars 29922 1726853678.57562: variable 'profile' from source: play vars 29922 1726853678.57679: variable 'profile' from source: play vars 29922 1726853678.57692: variable 'interface' from source: set_fact 29922 1726853678.57826: variable 'interface' from source: set_fact 29922 1726853678.57866: variable 
'__network_packages_default_wireless' from source: role '' defaults 29922 1726853678.57968: variable '__network_wireless_connections_defined' from source: role '' defaults 29922 1726853678.58168: variable 'network_connections' from source: play vars 29922 1726853678.58175: variable 'profile' from source: play vars 29922 1726853678.58221: variable 'profile' from source: play vars 29922 1726853678.58225: variable 'interface' from source: set_fact 29922 1726853678.58296: variable 'interface' from source: set_fact 29922 1726853678.58314: variable '__network_packages_default_team' from source: role '' defaults 29922 1726853678.58369: variable '__network_team_connections_defined' from source: role '' defaults 29922 1726853678.58560: variable 'network_connections' from source: play vars 29922 1726853678.58563: variable 'profile' from source: play vars 29922 1726853678.58610: variable 'profile' from source: play vars 29922 1726853678.58613: variable 'interface' from source: set_fact 29922 1726853678.58701: variable 'interface' from source: set_fact 29922 1726853678.58732: variable '__network_service_name_default_initscripts' from source: role '' defaults 29922 1726853678.58833: variable '__network_service_name_default_initscripts' from source: role '' defaults 29922 1726853678.58836: variable '__network_packages_default_initscripts' from source: role '' defaults 29922 1726853678.58881: variable '__network_packages_default_initscripts' from source: role '' defaults 29922 1726853678.59069: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 29922 1726853678.59535: variable 'network_connections' from source: play vars 29922 1726853678.59546: variable 'profile' from source: play vars 29922 1726853678.59620: variable 'profile' from source: play vars 29922 1726853678.59628: variable 'interface' from source: set_fact 29922 1726853678.59696: variable 'interface' from source: set_fact 29922 1726853678.59709: variable 'ansible_distribution' from source: facts 29922 1726853678.59725: variable '__network_rh_distros' from source: role '' defaults 29922 1726853678.59735: variable 'ansible_distribution_major_version' from source: facts 29922 1726853678.59752: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 29922 1726853678.59928: variable 'ansible_distribution' from source: facts 29922 1726853678.59964: variable '__network_rh_distros' from source: role '' defaults 29922 1726853678.60017: variable 'ansible_distribution_major_version' from source: facts 29922 1726853678.60024: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 29922 1726853678.60208: variable 'ansible_distribution' from source: facts 29922 1726853678.60211: variable '__network_rh_distros' from source: role '' defaults 29922 1726853678.60213: variable 'ansible_distribution_major_version' from source: facts 29922 1726853678.60238: variable 'network_provider' from source: set_fact 29922 1726853678.60249: variable 'ansible_facts' from source: unknown 29922 1726853678.60705: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 29922 1726853678.60710: when evaluation is False, skipping this task 29922 1726853678.60712: _execute() done 29922 1726853678.60714: dumping result to json 29922 1726853678.60717: done dumping result, returning 29922 1726853678.60720: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages 
[02083763-bbaf-51d4-513b-000000000079] 29922 1726853678.60722: sending task result for task 02083763-bbaf-51d4-513b-000000000079 29922 1726853678.60811: done sending task result for task 02083763-bbaf-51d4-513b-000000000079 29922 1726853678.60814: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 29922 1726853678.60864: no more pending results, returning what we have 29922 1726853678.60867: results queue empty 29922 1726853678.60868: checking for any_errors_fatal 29922 1726853678.60877: done checking for any_errors_fatal 29922 1726853678.60878: checking for max_fail_percentage 29922 1726853678.60879: done checking for max_fail_percentage 29922 1726853678.60880: checking to see if all hosts have failed and the running result is not ok 29922 1726853678.60881: done checking to see if all hosts have failed 29922 1726853678.60881: getting the remaining hosts for this loop 29922 1726853678.60883: done getting the remaining hosts for this loop 29922 1726853678.60886: getting the next task for host managed_node3 29922 1726853678.60891: done getting next task for host managed_node3 29922 1726853678.60895: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 29922 1726853678.60897: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29922 1726853678.60910: getting variables 29922 1726853678.60912: in VariableManager get_vars() 29922 1726853678.60949: Calling all_inventory to load vars for managed_node3 29922 1726853678.60951: Calling groups_inventory to load vars for managed_node3 29922 1726853678.60953: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853678.60969: Calling all_plugins_play to load vars for managed_node3 29922 1726853678.60979: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853678.60982: Calling groups_plugins_play to load vars for managed_node3 29922 1726853678.61818: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853678.63131: done with get_vars() 29922 1726853678.63155: done getting variables 29922 1726853678.63217: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 13:34:38 -0400 (0:00:00.133) 0:00:27.562 ****** 29922 1726853678.63248: entering _queue_task() for managed_node3/package 29922 1726853678.63564: worker is 1 (out of 1 available) 29922 1726853678.63578: exiting _queue_task() for managed_node3/package 29922 1726853678.63590: done queuing things up, now waiting for results queue to drain 29922 1726853678.63592: waiting for pending results... 
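Note on the skipped task above: "Install packages" is gated on the package facts gathered earlier in the run; because every name in network_packages was already present in ansible_facts.packages, the condition "not network_packages is subset(ansible_facts.packages.keys())" evaluated to False and the package module was never pushed to the host. The role's real task lives in the fedora.linux_system_roles.network collection; the following is only a minimal sketch of an equivalent conditional install using the same test:

    # Sketch of a conditional install (illustrative, not the role's source).
    # Requires package facts, e.g. from a prior ansible.builtin.package_facts task.
    - name: Install packages only when something is missing
      ansible.builtin.package:
        name: "{{ network_packages }}"
        state: present
      when: not network_packages is subset(ansible_facts.packages.keys())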
29922 1726853678.63916: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 29922 1726853678.63955: in run() - task 02083763-bbaf-51d4-513b-00000000007a 29922 1726853678.63992: variable 'ansible_search_path' from source: unknown 29922 1726853678.63996: variable 'ansible_search_path' from source: unknown 29922 1726853678.64016: calling self._execute() 29922 1726853678.64095: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853678.64100: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853678.64109: variable 'omit' from source: magic vars 29922 1726853678.64394: variable 'ansible_distribution_major_version' from source: facts 29922 1726853678.64403: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853678.64488: variable 'network_state' from source: role '' defaults 29922 1726853678.64495: Evaluated conditional (network_state != {}): False 29922 1726853678.64498: when evaluation is False, skipping this task 29922 1726853678.64501: _execute() done 29922 1726853678.64504: dumping result to json 29922 1726853678.64506: done dumping result, returning 29922 1726853678.64514: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [02083763-bbaf-51d4-513b-00000000007a] 29922 1726853678.64517: sending task result for task 02083763-bbaf-51d4-513b-00000000007a 29922 1726853678.64608: done sending task result for task 02083763-bbaf-51d4-513b-00000000007a 29922 1726853678.64611: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 29922 1726853678.64653: no more pending results, returning what we have 29922 1726853678.64659: results queue empty 29922 1726853678.64660: checking for any_errors_fatal 29922 1726853678.64668: done checking for any_errors_fatal 29922 1726853678.64669: checking for max_fail_percentage 29922 1726853678.64672: done checking for max_fail_percentage 29922 1726853678.64673: checking to see if all hosts have failed and the running result is not ok 29922 1726853678.64674: done checking to see if all hosts have failed 29922 1726853678.64674: getting the remaining hosts for this loop 29922 1726853678.64675: done getting the remaining hosts for this loop 29922 1726853678.64679: getting the next task for host managed_node3 29922 1726853678.64684: done getting next task for host managed_node3 29922 1726853678.64688: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 29922 1726853678.64690: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853678.64706: getting variables 29922 1726853678.64708: in VariableManager get_vars() 29922 1726853678.64739: Calling all_inventory to load vars for managed_node3 29922 1726853678.64742: Calling groups_inventory to load vars for managed_node3 29922 1726853678.64744: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853678.64752: Calling all_plugins_play to load vars for managed_node3 29922 1726853678.64757: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853678.64760: Calling groups_plugins_play to load vars for managed_node3 29922 1726853678.65653: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853678.67017: done with get_vars() 29922 1726853678.67036: done getting variables 29922 1726853678.67080: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 13:34:38 -0400 (0:00:00.038) 0:00:27.600 ****** 29922 1726853678.67103: entering _queue_task() for managed_node3/package 29922 1726853678.67341: worker is 1 (out of 1 available) 29922 1726853678.67355: exiting _queue_task() for managed_node3/package 29922 1726853678.67367: done queuing things up, now waiting for results queue to drain 29922 1726853678.67368: waiting for pending results... 
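Note: the "Install NetworkManager and nmstate when using network_state variable" task above, and the "Install python3-libnmstate when using network_state variable" task just queued, are both gated on "network_state != {}". In this run network_state keeps its empty role default, so both are skipped. A hypothetical invocation that would make them run (placeholder host and interface names, not taken from this run) could pass a non-empty nmstate-style state:

    # Hypothetical playbook: any non-empty network_state makes
    # "network_state != {}" evaluate to True for these tasks.
    - hosts: managed_node3
      vars:
        network_state:
          interfaces:
            - name: eth0
              type: ethernet
              state: up
      roles:
        - fedora.linux_system_roles.network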
29922 1726853678.67544: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 29922 1726853678.67622: in run() - task 02083763-bbaf-51d4-513b-00000000007b 29922 1726853678.67633: variable 'ansible_search_path' from source: unknown 29922 1726853678.67636: variable 'ansible_search_path' from source: unknown 29922 1726853678.67666: calling self._execute() 29922 1726853678.67740: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853678.67744: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853678.67753: variable 'omit' from source: magic vars 29922 1726853678.68028: variable 'ansible_distribution_major_version' from source: facts 29922 1726853678.68039: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853678.68118: variable 'network_state' from source: role '' defaults 29922 1726853678.68126: Evaluated conditional (network_state != {}): False 29922 1726853678.68129: when evaluation is False, skipping this task 29922 1726853678.68132: _execute() done 29922 1726853678.68135: dumping result to json 29922 1726853678.68139: done dumping result, returning 29922 1726853678.68150: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [02083763-bbaf-51d4-513b-00000000007b] 29922 1726853678.68153: sending task result for task 02083763-bbaf-51d4-513b-00000000007b 29922 1726853678.68237: done sending task result for task 02083763-bbaf-51d4-513b-00000000007b 29922 1726853678.68240: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 29922 1726853678.68297: no more pending results, returning what we have 29922 1726853678.68301: results queue empty 29922 1726853678.68301: checking for any_errors_fatal 29922 1726853678.68311: done checking for any_errors_fatal 29922 1726853678.68312: checking for max_fail_percentage 29922 1726853678.68313: done checking for max_fail_percentage 29922 1726853678.68314: checking to see if all hosts have failed and the running result is not ok 29922 1726853678.68315: done checking to see if all hosts have failed 29922 1726853678.68315: getting the remaining hosts for this loop 29922 1726853678.68317: done getting the remaining hosts for this loop 29922 1726853678.68320: getting the next task for host managed_node3 29922 1726853678.68325: done getting next task for host managed_node3 29922 1726853678.68329: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 29922 1726853678.68331: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853678.68344: getting variables 29922 1726853678.68346: in VariableManager get_vars() 29922 1726853678.68380: Calling all_inventory to load vars for managed_node3 29922 1726853678.68382: Calling groups_inventory to load vars for managed_node3 29922 1726853678.68384: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853678.68393: Calling all_plugins_play to load vars for managed_node3 29922 1726853678.68395: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853678.68397: Calling groups_plugins_play to load vars for managed_node3 29922 1726853678.69170: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853678.70143: done with get_vars() 29922 1726853678.70161: done getting variables 29922 1726853678.70205: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 13:34:38 -0400 (0:00:00.031) 0:00:27.631 ****** 29922 1726853678.70227: entering _queue_task() for managed_node3/service 29922 1726853678.70463: worker is 1 (out of 1 available) 29922 1726853678.70478: exiting _queue_task() for managed_node3/service 29922 1726853678.70490: done queuing things up, now waiting for results queue to drain 29922 1726853678.70491: waiting for pending results... 
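Note: the "Restart NetworkManager due to wireless or team interfaces" task queued above only acts when the role has flagged wireless or team connection profiles. A rough sketch of such a task, under that assumption (not the role's literal source):

    # Sketch: restart only when wireless or team profiles were detected.
    - name: Restart NetworkManager due to wireless or team interfaces (sketch)
      ansible.builtin.service:
        name: NetworkManager
        state: restarted
      when: __network_wireless_connections_defined or __network_team_connections_defined

As the following entries show, neither flag is set for the profile defined in network_connections here, so the conditional evaluates to False and the task is skipped.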
29922 1726853678.70661: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 29922 1726853678.70722: in run() - task 02083763-bbaf-51d4-513b-00000000007c 29922 1726853678.70735: variable 'ansible_search_path' from source: unknown 29922 1726853678.70739: variable 'ansible_search_path' from source: unknown 29922 1726853678.70768: calling self._execute() 29922 1726853678.70849: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853678.70854: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853678.70863: variable 'omit' from source: magic vars 29922 1726853678.71137: variable 'ansible_distribution_major_version' from source: facts 29922 1726853678.71148: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853678.71229: variable '__network_wireless_connections_defined' from source: role '' defaults 29922 1726853678.71360: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 29922 1726853678.76751: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 29922 1726853678.76799: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 29922 1726853678.76828: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 29922 1726853678.76849: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 29922 1726853678.76868: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 29922 1726853678.76917: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853678.76939: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29922 1726853678.76959: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853678.76986: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853678.76996: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853678.77027: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853678.77047: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29922 1726853678.77065: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 29922 1726853678.77092: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853678.77102: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853678.77129: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853678.77145: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29922 1726853678.77165: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853678.77190: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853678.77200: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853678.77306: variable 'network_connections' from source: play vars 29922 1726853678.77315: variable 'profile' from source: play vars 29922 1726853678.77364: variable 'profile' from source: play vars 29922 1726853678.77369: variable 'interface' from source: set_fact 29922 1726853678.77416: variable 'interface' from source: set_fact 29922 1726853678.77462: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 29922 1726853678.77564: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 29922 1726853678.77602: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 29922 1726853678.77625: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 29922 1726853678.77645: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 29922 1726853678.77675: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 29922 1726853678.77691: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 29922 1726853678.77710: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853678.77728: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 29922 1726853678.77759: variable '__network_team_connections_defined' from source: role '' defaults 29922 
1726853678.77902: variable 'network_connections' from source: play vars 29922 1726853678.77905: variable 'profile' from source: play vars 29922 1726853678.77950: variable 'profile' from source: play vars 29922 1726853678.77953: variable 'interface' from source: set_fact 29922 1726853678.77995: variable 'interface' from source: set_fact 29922 1726853678.78014: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 29922 1726853678.78017: when evaluation is False, skipping this task 29922 1726853678.78020: _execute() done 29922 1726853678.78022: dumping result to json 29922 1726853678.78026: done dumping result, returning 29922 1726853678.78036: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [02083763-bbaf-51d4-513b-00000000007c] 29922 1726853678.78046: sending task result for task 02083763-bbaf-51d4-513b-00000000007c 29922 1726853678.78120: done sending task result for task 02083763-bbaf-51d4-513b-00000000007c 29922 1726853678.78123: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 29922 1726853678.78182: no more pending results, returning what we have 29922 1726853678.78185: results queue empty 29922 1726853678.78186: checking for any_errors_fatal 29922 1726853678.78192: done checking for any_errors_fatal 29922 1726853678.78192: checking for max_fail_percentage 29922 1726853678.78194: done checking for max_fail_percentage 29922 1726853678.78195: checking to see if all hosts have failed and the running result is not ok 29922 1726853678.78195: done checking to see if all hosts have failed 29922 1726853678.78196: getting the remaining hosts for this loop 29922 1726853678.78197: done getting the remaining hosts for this loop 29922 1726853678.78201: getting the next task for host managed_node3 29922 1726853678.78206: done getting next task for host managed_node3 29922 1726853678.78209: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 29922 1726853678.78211: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853678.78224: getting variables 29922 1726853678.78225: in VariableManager get_vars() 29922 1726853678.78262: Calling all_inventory to load vars for managed_node3 29922 1726853678.78264: Calling groups_inventory to load vars for managed_node3 29922 1726853678.78266: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853678.78277: Calling all_plugins_play to load vars for managed_node3 29922 1726853678.78279: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853678.78288: Calling groups_plugins_play to load vars for managed_node3 29922 1726853678.82826: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853678.83698: done with get_vars() 29922 1726853678.83713: done getting variables 29922 1726853678.83747: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 13:34:38 -0400 (0:00:00.135) 0:00:27.767 ****** 29922 1726853678.83765: entering _queue_task() for managed_node3/service 29922 1726853678.84025: worker is 1 (out of 1 available) 29922 1726853678.84039: exiting _queue_task() for managed_node3/service 29922 1726853678.84051: done queuing things up, now waiting for results queue to drain 29922 1726853678.84052: waiting for pending results... 29922 1726853678.84388: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 29922 1726853678.84394: in run() - task 02083763-bbaf-51d4-513b-00000000007d 29922 1726853678.84397: variable 'ansible_search_path' from source: unknown 29922 1726853678.84399: variable 'ansible_search_path' from source: unknown 29922 1726853678.84527: calling self._execute() 29922 1726853678.84643: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853678.84659: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853678.84678: variable 'omit' from source: magic vars 29922 1726853678.85378: variable 'ansible_distribution_major_version' from source: facts 29922 1726853678.85382: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853678.85536: variable 'network_provider' from source: set_fact 29922 1726853678.85591: variable 'network_state' from source: role '' defaults 29922 1726853678.85708: Evaluated conditional (network_provider == "nm" or network_state != {}): True 29922 1726853678.85713: variable 'omit' from source: magic vars 29922 1726853678.85824: variable 'omit' from source: magic vars 29922 1726853678.85897: variable 'network_service_name' from source: role '' defaults 29922 1726853678.85962: variable 'network_service_name' from source: role '' defaults 29922 1726853678.86062: variable '__network_provider_setup' from source: role '' defaults 29922 1726853678.86069: variable '__network_service_name_default_nm' from source: role '' defaults 29922 1726853678.86178: variable '__network_service_name_default_nm' from source: role '' defaults 29922 1726853678.86181: variable '__network_packages_default_nm' from source: role '' defaults 
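Note: unlike the preceding tasks, "Enable and start NetworkManager" passes its conditional (network_provider == "nm" or network_state != {} evaluates True because the provider is nm), so the service action plugin actually runs. A minimal sketch of an equivalent task (illustrative; the role resolves the unit name via network_service_name and its defaults, which here ends up targeting the NetworkManager unit shown in the systemd output below):

    # Sketch of an enable-and-start service task (not the role's literal source).
    - name: Enable and start NetworkManager (sketch)
      ansible.builtin.service:
        name: "{{ network_service_name }}"
        state: started
        enabled: true
      when: network_provider == "nm" or network_state != {}

The entries that follow show the mechanics of that execution: the worker reuses the existing SSH ControlMaster socket, creates a remote temp directory under /root/.ansible/tmp, transfers the zipped AnsiballZ_systemd.py payload over sftp, marks it executable, and runs it with the remote python3.12; the large JSON blob at the end is that module reporting the NetworkManager unit's systemd properties.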
29922 1726853678.86239: variable '__network_packages_default_nm' from source: role '' defaults 29922 1726853678.86446: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 29922 1726853678.89087: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 29922 1726853678.89195: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 29922 1726853678.89199: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 29922 1726853678.89208: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 29922 1726853678.89234: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 29922 1726853678.89476: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853678.89481: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29922 1726853678.89483: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853678.89486: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853678.89488: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853678.89491: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853678.89493: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29922 1726853678.89498: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853678.89536: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853678.89549: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853678.89878: variable '__network_packages_default_gobject_packages' from source: role '' defaults 29922 1726853678.89895: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853678.89918: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29922 1726853678.89943: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853678.89982: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853678.89994: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853678.90207: variable 'ansible_python' from source: facts 29922 1726853678.90210: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 29922 1726853678.90212: variable '__network_wpa_supplicant_required' from source: role '' defaults 29922 1726853678.90254: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 29922 1726853678.90375: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853678.90400: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29922 1726853678.90424: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853678.90461: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853678.90475: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853678.90523: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853678.90544: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29922 1726853678.90567: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853678.90605: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853678.90617: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853678.90748: variable 'network_connections' from 
source: play vars 29922 1726853678.90760: variable 'profile' from source: play vars 29922 1726853678.90826: variable 'profile' from source: play vars 29922 1726853678.90829: variable 'interface' from source: set_fact 29922 1726853678.91076: variable 'interface' from source: set_fact 29922 1726853678.91081: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 29922 1726853678.91328: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 29922 1726853678.91375: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 29922 1726853678.91416: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 29922 1726853678.91457: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 29922 1726853678.91515: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 29922 1726853678.91677: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 29922 1726853678.91680: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853678.91683: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 29922 1726853678.91685: variable '__network_wireless_connections_defined' from source: role '' defaults 29922 1726853678.91995: variable 'network_connections' from source: play vars 29922 1726853678.92017: variable 'profile' from source: play vars 29922 1726853678.92070: variable 'profile' from source: play vars 29922 1726853678.92424: variable 'interface' from source: set_fact 29922 1726853678.92427: variable 'interface' from source: set_fact 29922 1726853678.92430: variable '__network_packages_default_wireless' from source: role '' defaults 29922 1726853678.92649: variable '__network_wireless_connections_defined' from source: role '' defaults 29922 1726853678.93034: variable 'network_connections' from source: play vars 29922 1726853678.93038: variable 'profile' from source: play vars 29922 1726853678.93411: variable 'profile' from source: play vars 29922 1726853678.93414: variable 'interface' from source: set_fact 29922 1726853678.93487: variable 'interface' from source: set_fact 29922 1726853678.93511: variable '__network_packages_default_team' from source: role '' defaults 29922 1726853678.93789: variable '__network_team_connections_defined' from source: role '' defaults 29922 1726853678.94259: variable 'network_connections' from source: play vars 29922 1726853678.94262: variable 'profile' from source: play vars 29922 1726853678.94330: variable 'profile' from source: play vars 29922 1726853678.94333: variable 'interface' from source: set_fact 29922 1726853678.94758: variable 'interface' from source: set_fact 29922 1726853678.94812: variable '__network_service_name_default_initscripts' from source: role '' defaults 29922 1726853678.94870: variable '__network_service_name_default_initscripts' from source: role '' defaults 29922 1726853678.94979: 
variable '__network_packages_default_initscripts' from source: role '' defaults 29922 1726853678.95037: variable '__network_packages_default_initscripts' from source: role '' defaults 29922 1726853678.95457: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 29922 1726853678.96528: variable 'network_connections' from source: play vars 29922 1726853678.96577: variable 'profile' from source: play vars 29922 1726853678.96587: variable 'profile' from source: play vars 29922 1726853678.96594: variable 'interface' from source: set_fact 29922 1726853678.96777: variable 'interface' from source: set_fact 29922 1726853678.96780: variable 'ansible_distribution' from source: facts 29922 1726853678.96782: variable '__network_rh_distros' from source: role '' defaults 29922 1726853678.96784: variable 'ansible_distribution_major_version' from source: facts 29922 1726853678.96786: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 29922 1726853678.96954: variable 'ansible_distribution' from source: facts 29922 1726853678.97084: variable '__network_rh_distros' from source: role '' defaults 29922 1726853678.97094: variable 'ansible_distribution_major_version' from source: facts 29922 1726853678.97113: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 29922 1726853678.97641: variable 'ansible_distribution' from source: facts 29922 1726853678.97658: variable '__network_rh_distros' from source: role '' defaults 29922 1726853678.97669: variable 'ansible_distribution_major_version' from source: facts 29922 1726853678.97714: variable 'network_provider' from source: set_fact 29922 1726853678.97802: variable 'omit' from source: magic vars 29922 1726853678.97833: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29922 1726853678.97908: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29922 1726853678.97933: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29922 1726853678.97995: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853678.98013: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853678.98177: variable 'inventory_hostname' from source: host vars for 'managed_node3' 29922 1726853678.98180: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853678.98183: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853678.98576: Set connection var ansible_connection to ssh 29922 1726853678.98579: Set connection var ansible_module_compression to ZIP_DEFLATED 29922 1726853678.98582: Set connection var ansible_shell_executable to /bin/sh 29922 1726853678.98583: Set connection var ansible_pipelining to False 29922 1726853678.98585: Set connection var ansible_timeout to 10 29922 1726853678.98587: Set connection var ansible_shell_type to sh 29922 1726853678.98589: variable 'ansible_shell_executable' from source: unknown 29922 1726853678.98591: variable 'ansible_connection' from source: unknown 29922 1726853678.98593: variable 'ansible_module_compression' from source: unknown 29922 1726853678.98595: variable 'ansible_shell_type' from source: unknown 29922 1726853678.98596: variable 'ansible_shell_executable' from 
source: unknown 29922 1726853678.98598: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853678.98604: variable 'ansible_pipelining' from source: unknown 29922 1726853678.98606: variable 'ansible_timeout' from source: unknown 29922 1726853678.98608: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853678.98877: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 29922 1726853678.98881: variable 'omit' from source: magic vars 29922 1726853678.98883: starting attempt loop 29922 1726853678.98886: running the handler 29922 1726853678.98888: variable 'ansible_facts' from source: unknown 29922 1726853679.00161: _low_level_execute_command(): starting 29922 1726853679.00175: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 29922 1726853679.00816: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853679.00829: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853679.00843: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853679.00860: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853679.00880: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 29922 1726853679.00891: stderr chunk (state=3): >>>debug2: match not found <<< 29922 1726853679.00904: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853679.00920: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 29922 1726853679.00931: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<< 29922 1726853679.00940: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 29922 1726853679.01026: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853679.01049: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853679.01148: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853679.02859: stdout chunk (state=3): >>>/root <<< 29922 1726853679.03016: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853679.03019: stdout chunk (state=3): >>><<< 29922 1726853679.03022: stderr chunk (state=3): >>><<< 29922 1726853679.03039: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 
10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853679.03062: _low_level_execute_command(): starting 29922 1726853679.03074: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853679.030488-31223-53616016878985 `" && echo ansible-tmp-1726853679.030488-31223-53616016878985="` echo /root/.ansible/tmp/ansible-tmp-1726853679.030488-31223-53616016878985 `" ) && sleep 0' 29922 1726853679.04262: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853679.04277: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853679.04395: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853679.04478: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853679.06565: stdout chunk (state=3): >>>ansible-tmp-1726853679.030488-31223-53616016878985=/root/.ansible/tmp/ansible-tmp-1726853679.030488-31223-53616016878985 <<< 29922 1726853679.06577: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853679.06602: stdout chunk (state=3): >>><<< 29922 1726853679.06610: stderr chunk (state=3): >>><<< 29922 1726853679.06775: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853679.030488-31223-53616016878985=/root/.ansible/tmp/ansible-tmp-1726853679.030488-31223-53616016878985 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853679.06779: variable 'ansible_module_compression' from source: unknown 29922 1726853679.06888: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-29922avbm1zsd/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 29922 1726853679.07077: variable 'ansible_facts' from source: unknown 29922 1726853679.07410: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853679.030488-31223-53616016878985/AnsiballZ_systemd.py 29922 1726853679.07801: Sending initial data 29922 1726853679.07804: Sent initial data (154 bytes) 29922 1726853679.08675: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853679.08789: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853679.08792: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853679.08809: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853679.08825: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853679.08912: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853679.10583: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 29922 1726853679.10648: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" 
revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 29922 1726853679.10813: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 29922 1726853679.11034: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-29922avbm1zsd/tmp0dmbie7g /root/.ansible/tmp/ansible-tmp-1726853679.030488-31223-53616016878985/AnsiballZ_systemd.py <<< 29922 1726853679.11037: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853679.030488-31223-53616016878985/AnsiballZ_systemd.py" <<< 29922 1726853679.11079: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-29922avbm1zsd/tmp0dmbie7g" to remote "/root/.ansible/tmp/ansible-tmp-1726853679.030488-31223-53616016878985/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853679.030488-31223-53616016878985/AnsiballZ_systemd.py" <<< 29922 1726853679.13110: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853679.13189: stderr chunk (state=3): >>><<< 29922 1726853679.13203: stdout chunk (state=3): >>><<< 29922 1726853679.13273: done transferring module to remote 29922 1726853679.13362: _low_level_execute_command(): starting 29922 1726853679.13366: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853679.030488-31223-53616016878985/ /root/.ansible/tmp/ansible-tmp-1726853679.030488-31223-53616016878985/AnsiballZ_systemd.py && sleep 0' 29922 1726853679.13895: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853679.13909: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853679.13922: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853679.13939: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853679.13959: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 29922 1726853679.13973: stderr chunk (state=3): >>>debug2: match not found <<< 29922 1726853679.13990: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853679.14008: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 29922 1726853679.14092: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853679.14111: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853679.14126: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853679.14147: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853679.14239: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 
1726853679.16138: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853679.16141: stdout chunk (state=3): >>><<< 29922 1726853679.16149: stderr chunk (state=3): >>><<< 29922 1726853679.16164: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853679.16167: _low_level_execute_command(): starting 29922 1726853679.16173: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853679.030488-31223-53616016878985/AnsiballZ_systemd.py && sleep 0' 29922 1726853679.16958: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853679.16962: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853679.16970: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853679.16976: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853679.16978: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853679.17028: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853679.46649: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 
30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "705", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ExecMainStartTimestampMonotonic": "24298536", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ExecMainHandoffTimestampMonotonic": "24318182", "ExecMainPID": "705", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10616832", "MemoryPeak": "14114816", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3320840192", "EffectiveMemoryMax": "3702865920", "EffectiveMemoryHigh": "3702865920", "CPUUsageNSec": "2102076000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", 
"ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpR<<< 29922 1726853679.46654: stdout chunk (state=3): >>>eceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice sysinit.target dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.target shutdown.target cloud-init.service NetworkManager-wait-online.service multi-user.target", "After": "sysini<<< 29922 1726853679.46684: stdout chunk (state=3): >>>t.target systemd-journald.socket basic.target cloud-init-local.service network-pre.target dbus.socket system.slice dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:26:57 EDT", "StateChangeTimestampMonotonic": "361843458", "InactiveExitTimestamp": "Fri 2024-09-20 13:21:20 EDT", "InactiveExitTimestampMonotonic": "24299070", "ActiveEnterTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ActiveEnterTimestampMonotonic": "24855925", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ConditionTimestampMonotonic": "24297535", "AssertTimestamp": "Fri 2024-09-20 13:21:20 EDT", "AssertTimestampMonotonic": "24297537", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "125a1bdc44cb4bffa8aeca788d2f2fa3", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 29922 1726853679.49089: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853679.49121: stderr chunk (state=3): >>>Shared connection to 10.31.11.217 closed. 
<<< 29922 1726853679.49124: stdout chunk (state=3): >>><<< 29922 1726853679.49126: stderr chunk (state=3): >>><<< 29922 1726853679.49144: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "705", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ExecMainStartTimestampMonotonic": "24298536", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ExecMainHandoffTimestampMonotonic": "24318182", "ExecMainPID": "705", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10616832", "MemoryPeak": "14114816", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3320840192", "EffectiveMemoryMax": "3702865920", "EffectiveMemoryHigh": "3702865920", "CPUUsageNSec": "2102076000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", 
"DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice sysinit.target dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.target shutdown.target cloud-init.service NetworkManager-wait-online.service multi-user.target", "After": "sysinit.target systemd-journald.socket basic.target cloud-init-local.service network-pre.target dbus.socket system.slice dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:26:57 EDT", "StateChangeTimestampMonotonic": "361843458", "InactiveExitTimestamp": "Fri 2024-09-20 13:21:20 EDT", "InactiveExitTimestampMonotonic": "24299070", "ActiveEnterTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ActiveEnterTimestampMonotonic": "24855925", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ConditionTimestampMonotonic": "24297535", "AssertTimestamp": "Fri 2024-09-20 13:21:20 EDT", "AssertTimestampMonotonic": "24297537", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "125a1bdc44cb4bffa8aeca788d2f2fa3", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 29922 1726853679.49448: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853679.030488-31223-53616016878985/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 29922 1726853679.49451: _low_level_execute_command(): starting 29922 1726853679.49453: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853679.030488-31223-53616016878985/ > /dev/null 2>&1 && sleep 0' 29922 1726853679.50011: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853679.50023: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853679.50036: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853679.50095: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853679.50160: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853679.50183: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853679.50212: 
stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853679.50303: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853679.52229: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853679.52377: stderr chunk (state=3): >>><<< 29922 1726853679.52380: stdout chunk (state=3): >>><<< 29922 1726853679.52383: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853679.52386: handler run complete 29922 1726853679.52388: attempt loop complete, returning result 29922 1726853679.52393: _execute() done 29922 1726853679.52395: dumping result to json 29922 1726853679.52416: done dumping result, returning 29922 1726853679.52425: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [02083763-bbaf-51d4-513b-00000000007d] 29922 1726853679.52428: sending task result for task 02083763-bbaf-51d4-513b-00000000007d 29922 1726853679.52737: done sending task result for task 02083763-bbaf-51d4-513b-00000000007d 29922 1726853679.52740: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 29922 1726853679.52786: no more pending results, returning what we have 29922 1726853679.52789: results queue empty 29922 1726853679.52790: checking for any_errors_fatal 29922 1726853679.52797: done checking for any_errors_fatal 29922 1726853679.52798: checking for max_fail_percentage 29922 1726853679.52800: done checking for max_fail_percentage 29922 1726853679.52800: checking to see if all hosts have failed and the running result is not ok 29922 1726853679.52801: done checking to see if all hosts have failed 29922 1726853679.52802: getting the remaining hosts for this loop 29922 1726853679.52803: done getting the remaining hosts for this loop 29922 1726853679.52807: getting the next task for host managed_node3 29922 1726853679.52812: done getting next task for host managed_node3 29922 1726853679.52814: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 29922 1726853679.52816: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29922 1726853679.52826: getting variables 29922 1726853679.52827: in VariableManager get_vars() 29922 1726853679.52860: Calling all_inventory to load vars for managed_node3 29922 1726853679.52862: Calling groups_inventory to load vars for managed_node3 29922 1726853679.52864: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853679.52875: Calling all_plugins_play to load vars for managed_node3 29922 1726853679.52878: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853679.52881: Calling groups_plugins_play to load vars for managed_node3 29922 1726853679.54617: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853679.56277: done with get_vars() 29922 1726853679.56308: done getting variables 29922 1726853679.56368: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 13:34:39 -0400 (0:00:00.726) 0:00:28.493 ****** 29922 1726853679.56402: entering _queue_task() for managed_node3/service 29922 1726853679.56761: worker is 1 (out of 1 available) 29922 1726853679.56775: exiting _queue_task() for managed_node3/service 29922 1726853679.56787: done queuing things up, now waiting for results queue to drain 29922 1726853679.56788: waiting for pending results... 
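For readers tracing the result above: the module invocation the executor logged (ansible.legacy.systemd with name=NetworkManager, state=started, enabled=true, and _ansible_no_log=True, which is why the task result is shown as censored) is what a task of roughly the following shape produces. This is a sketch reconstructed from the logged module_args, not the role's actual source; the task in the fedora.linux_system_roles.network collection may be written differently.

# Sketch of an equivalent task; the arguments mirror the logged module_args,
# everything else the role does around it is omitted.
- name: Enable and start NetworkManager
  ansible.builtin.systemd:    # the log shows ansible.legacy.systemd executed with these arguments
    name: NetworkManager
    state: started
    enabled: true
  no_log: true                # matches the censored ("no_log: true") result in the log
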
29922 1726853679.57198: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 29922 1726853679.57241: in run() - task 02083763-bbaf-51d4-513b-00000000007e 29922 1726853679.57263: variable 'ansible_search_path' from source: unknown 29922 1726853679.57275: variable 'ansible_search_path' from source: unknown 29922 1726853679.57400: calling self._execute() 29922 1726853679.57446: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853679.57462: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853679.57483: variable 'omit' from source: magic vars 29922 1726853679.57902: variable 'ansible_distribution_major_version' from source: facts 29922 1726853679.57921: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853679.58044: variable 'network_provider' from source: set_fact 29922 1726853679.58061: Evaluated conditional (network_provider == "nm"): True 29922 1726853679.58151: variable '__network_wpa_supplicant_required' from source: role '' defaults 29922 1726853679.58248: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 29922 1726853679.58416: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 29922 1726853679.60556: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 29922 1726853679.60653: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 29922 1726853679.60669: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 29922 1726853679.60709: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 29922 1726853679.60738: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 29922 1726853679.60836: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853679.60978: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29922 1726853679.60982: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853679.60984: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853679.60987: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853679.61008: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853679.61035: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 29922 1726853679.61062: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853679.61109: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853679.61127: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853679.61168: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853679.61200: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29922 1726853679.61226: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853679.61266: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853679.61288: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853679.61442: variable 'network_connections' from source: play vars 29922 1726853679.61461: variable 'profile' from source: play vars 29922 1726853679.61542: variable 'profile' from source: play vars 29922 1726853679.61551: variable 'interface' from source: set_fact 29922 1726853679.61614: variable 'interface' from source: set_fact 29922 1726853679.61696: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 29922 1726853679.61874: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 29922 1726853679.61953: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 29922 1726853679.61956: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 29922 1726853679.61986: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 29922 1726853679.62030: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 29922 1726853679.62062: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 29922 1726853679.62095: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853679.62125: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 29922 1726853679.62277: variable '__network_wireless_connections_defined' from source: role '' defaults 29922 1726853679.62402: variable 'network_connections' from source: play vars 29922 1726853679.62414: variable 'profile' from source: play vars 29922 1726853679.62481: variable 'profile' from source: play vars 29922 1726853679.62495: variable 'interface' from source: set_fact 29922 1726853679.62557: variable 'interface' from source: set_fact 29922 1726853679.62598: Evaluated conditional (__network_wpa_supplicant_required): False 29922 1726853679.62611: when evaluation is False, skipping this task 29922 1726853679.62619: _execute() done 29922 1726853679.62636: dumping result to json 29922 1726853679.62645: done dumping result, returning 29922 1726853679.62656: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [02083763-bbaf-51d4-513b-00000000007e] 29922 1726853679.62665: sending task result for task 02083763-bbaf-51d4-513b-00000000007e skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 29922 1726853679.62921: no more pending results, returning what we have 29922 1726853679.62925: results queue empty 29922 1726853679.62926: checking for any_errors_fatal 29922 1726853679.62951: done checking for any_errors_fatal 29922 1726853679.62952: checking for max_fail_percentage 29922 1726853679.62954: done checking for max_fail_percentage 29922 1726853679.62955: checking to see if all hosts have failed and the running result is not ok 29922 1726853679.62956: done checking to see if all hosts have failed 29922 1726853679.62957: getting the remaining hosts for this loop 29922 1726853679.62958: done getting the remaining hosts for this loop 29922 1726853679.62962: getting the next task for host managed_node3 29922 1726853679.62968: done getting next task for host managed_node3 29922 1726853679.62974: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 29922 1726853679.62976: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853679.62993: getting variables 29922 1726853679.62995: in VariableManager get_vars() 29922 1726853679.63040: Calling all_inventory to load vars for managed_node3 29922 1726853679.63043: Calling groups_inventory to load vars for managed_node3 29922 1726853679.63045: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853679.63057: Calling all_plugins_play to load vars for managed_node3 29922 1726853679.63060: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853679.63063: Calling groups_plugins_play to load vars for managed_node3 29922 1726853679.63584: done sending task result for task 02083763-bbaf-51d4-513b-00000000007e 29922 1726853679.63588: WORKER PROCESS EXITING 29922 1726853679.64668: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853679.66291: done with get_vars() 29922 1726853679.66318: done getting variables 29922 1726853679.66379: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 13:34:39 -0400 (0:00:00.100) 0:00:28.593 ****** 29922 1726853679.66407: entering _queue_task() for managed_node3/service 29922 1726853679.66751: worker is 1 (out of 1 available) 29922 1726853679.66763: exiting _queue_task() for managed_node3/service 29922 1726853679.66979: done queuing things up, now waiting for results queue to drain 29922 1726853679.66981: waiting for pending results... 
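The wpa_supplicant skip recorded above is driven entirely by the when-conditions the executor evaluated: ansible_distribution_major_version != '6' (True), network_provider == "nm" (True), and __network_wpa_supplicant_required (False, since no IEEE 802.1X or wireless connections are defined in the play). A minimal sketch of that guard pattern follows; the condition names come from the log, while the module arguments are illustrative assumptions.

# Sketch: conditions as evaluated in the log; module arguments are
# placeholders, not copied from the role.
- name: Enable and start wpa_supplicant
  ansible.builtin.service:
    name: wpa_supplicant
    state: started
    enabled: true
  when:
    - ansible_distribution_major_version != '6'
    - network_provider == "nm"
    - __network_wpa_supplicant_required | bool
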
29922 1726853679.67061: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 29922 1726853679.67191: in run() - task 02083763-bbaf-51d4-513b-00000000007f 29922 1726853679.67217: variable 'ansible_search_path' from source: unknown 29922 1726853679.67225: variable 'ansible_search_path' from source: unknown 29922 1726853679.67268: calling self._execute() 29922 1726853679.67381: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853679.67394: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853679.67408: variable 'omit' from source: magic vars 29922 1726853679.67785: variable 'ansible_distribution_major_version' from source: facts 29922 1726853679.67804: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853679.67935: variable 'network_provider' from source: set_fact 29922 1726853679.67948: Evaluated conditional (network_provider == "initscripts"): False 29922 1726853679.67955: when evaluation is False, skipping this task 29922 1726853679.67967: _execute() done 29922 1726853679.67977: dumping result to json 29922 1726853679.67985: done dumping result, returning 29922 1726853679.67996: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [02083763-bbaf-51d4-513b-00000000007f] 29922 1726853679.68006: sending task result for task 02083763-bbaf-51d4-513b-00000000007f 29922 1726853679.68150: done sending task result for task 02083763-bbaf-51d4-513b-00000000007f 29922 1726853679.68154: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 29922 1726853679.68229: no more pending results, returning what we have 29922 1726853679.68235: results queue empty 29922 1726853679.68236: checking for any_errors_fatal 29922 1726853679.68248: done checking for any_errors_fatal 29922 1726853679.68248: checking for max_fail_percentage 29922 1726853679.68250: done checking for max_fail_percentage 29922 1726853679.68251: checking to see if all hosts have failed and the running result is not ok 29922 1726853679.68252: done checking to see if all hosts have failed 29922 1726853679.68253: getting the remaining hosts for this loop 29922 1726853679.68254: done getting the remaining hosts for this loop 29922 1726853679.68258: getting the next task for host managed_node3 29922 1726853679.68265: done getting next task for host managed_node3 29922 1726853679.68269: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 29922 1726853679.68274: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853679.68292: getting variables 29922 1726853679.68294: in VariableManager get_vars() 29922 1726853679.68338: Calling all_inventory to load vars for managed_node3 29922 1726853679.68341: Calling groups_inventory to load vars for managed_node3 29922 1726853679.68343: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853679.68356: Calling all_plugins_play to load vars for managed_node3 29922 1726853679.68359: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853679.68363: Calling groups_plugins_play to load vars for managed_node3 29922 1726853679.70110: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853679.71736: done with get_vars() 29922 1726853679.71766: done getting variables 29922 1726853679.71827: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 13:34:39 -0400 (0:00:00.054) 0:00:28.648 ****** 29922 1726853679.71858: entering _queue_task() for managed_node3/copy 29922 1726853679.72210: worker is 1 (out of 1 available) 29922 1726853679.72224: exiting _queue_task() for managed_node3/copy 29922 1726853679.72237: done queuing things up, now waiting for results queue to drain 29922 1726853679.72238: waiting for pending results... 29922 1726853679.72600: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 29922 1726853679.72697: in run() - task 02083763-bbaf-51d4-513b-000000000080 29922 1726853679.72702: variable 'ansible_search_path' from source: unknown 29922 1726853679.72704: variable 'ansible_search_path' from source: unknown 29922 1726853679.72727: calling self._execute() 29922 1726853679.72849: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853679.72861: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853679.72878: variable 'omit' from source: magic vars 29922 1726853679.73477: variable 'ansible_distribution_major_version' from source: facts 29922 1726853679.73481: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853679.73484: variable 'network_provider' from source: set_fact 29922 1726853679.73486: Evaluated conditional (network_provider == "initscripts"): False 29922 1726853679.73488: when evaluation is False, skipping this task 29922 1726853679.73491: _execute() done 29922 1726853679.73493: dumping result to json 29922 1726853679.73495: done dumping result, returning 29922 1726853679.73498: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [02083763-bbaf-51d4-513b-000000000080] 29922 1726853679.73500: sending task result for task 02083763-bbaf-51d4-513b-000000000080 29922 1726853679.73584: done sending task result for task 02083763-bbaf-51d4-513b-000000000080 29922 1726853679.73587: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == 
\"initscripts\"", "skip_reason": "Conditional result was False" } 29922 1726853679.73641: no more pending results, returning what we have 29922 1726853679.73646: results queue empty 29922 1726853679.73647: checking for any_errors_fatal 29922 1726853679.73656: done checking for any_errors_fatal 29922 1726853679.73657: checking for max_fail_percentage 29922 1726853679.73659: done checking for max_fail_percentage 29922 1726853679.73660: checking to see if all hosts have failed and the running result is not ok 29922 1726853679.73661: done checking to see if all hosts have failed 29922 1726853679.73662: getting the remaining hosts for this loop 29922 1726853679.73663: done getting the remaining hosts for this loop 29922 1726853679.73667: getting the next task for host managed_node3 29922 1726853679.73677: done getting next task for host managed_node3 29922 1726853679.73682: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 29922 1726853679.73684: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29922 1726853679.73702: getting variables 29922 1726853679.73704: in VariableManager get_vars() 29922 1726853679.73750: Calling all_inventory to load vars for managed_node3 29922 1726853679.73753: Calling groups_inventory to load vars for managed_node3 29922 1726853679.73756: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853679.73769: Calling all_plugins_play to load vars for managed_node3 29922 1726853679.73978: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853679.73983: Calling groups_plugins_play to load vars for managed_node3 29922 1726853679.75425: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853679.77083: done with get_vars() 29922 1726853679.77107: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 13:34:39 -0400 (0:00:00.053) 0:00:28.701 ****** 29922 1726853679.77188: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 29922 1726853679.77534: worker is 1 (out of 1 available) 29922 1726853679.77548: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 29922 1726853679.77560: done queuing things up, now waiting for results queue to drain 29922 1726853679.77561: waiting for pending results... 
29922 1726853679.77842: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 29922 1726853679.77996: in run() - task 02083763-bbaf-51d4-513b-000000000081 29922 1726853679.78000: variable 'ansible_search_path' from source: unknown 29922 1726853679.78003: variable 'ansible_search_path' from source: unknown 29922 1726853679.78026: calling self._execute() 29922 1726853679.78377: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853679.78380: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853679.78383: variable 'omit' from source: magic vars 29922 1726853679.78542: variable 'ansible_distribution_major_version' from source: facts 29922 1726853679.78559: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853679.78575: variable 'omit' from source: magic vars 29922 1726853679.78623: variable 'omit' from source: magic vars 29922 1726853679.78792: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 29922 1726853679.80848: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 29922 1726853679.80925: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 29922 1726853679.80964: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 29922 1726853679.81009: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 29922 1726853679.81041: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 29922 1726853679.81129: variable 'network_provider' from source: set_fact 29922 1726853679.81267: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853679.81318: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29922 1726853679.81352: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853679.81435: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853679.81439: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853679.81497: variable 'omit' from source: magic vars 29922 1726853679.81614: variable 'omit' from source: magic vars 29922 1726853679.81724: variable 'network_connections' from source: play vars 29922 1726853679.81742: variable 'profile' from source: play vars 29922 1726853679.81814: variable 'profile' from source: play vars 29922 1726853679.81869: variable 'interface' from source: set_fact 29922 1726853679.81889: variable 'interface' from source: set_fact 29922 1726853679.82028: variable 'omit' from source: magic vars 29922 1726853679.82040: 
variable '__lsr_ansible_managed' from source: task vars 29922 1726853679.82107: variable '__lsr_ansible_managed' from source: task vars 29922 1726853679.82386: Loaded config def from plugin (lookup/template) 29922 1726853679.82394: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 29922 1726853679.82424: File lookup term: get_ansible_managed.j2 29922 1726853679.82431: variable 'ansible_search_path' from source: unknown 29922 1726853679.82520: evaluation_path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 29922 1726853679.82525: search_path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 29922 1726853679.82527: variable 'ansible_search_path' from source: unknown 29922 1726853679.88914: variable 'ansible_managed' from source: unknown 29922 1726853679.89042: variable 'omit' from source: magic vars 29922 1726853679.89079: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29922 1726853679.89110: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29922 1726853679.89132: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29922 1726853679.89155: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853679.89176: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853679.89211: variable 'inventory_hostname' from source: host vars for 'managed_node3' 29922 1726853679.89219: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853679.89270: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853679.89326: Set connection var ansible_connection to ssh 29922 1726853679.89338: Set connection var ansible_module_compression to ZIP_DEFLATED 29922 1726853679.89350: Set connection var ansible_shell_executable to /bin/sh 29922 1726853679.89361: Set connection var ansible_pipelining to False 29922 1726853679.89370: Set connection var ansible_timeout to 10 29922 1726853679.89382: Set connection var ansible_shell_type to sh 29922 1726853679.89411: variable 'ansible_shell_executable' from source: unknown 29922 1726853679.89418: variable 'ansible_connection' from source: unknown 29922 1726853679.89425: 
variable 'ansible_module_compression' from source: unknown 29922 1726853679.89578: variable 'ansible_shell_type' from source: unknown 29922 1726853679.89581: variable 'ansible_shell_executable' from source: unknown 29922 1726853679.89583: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853679.89585: variable 'ansible_pipelining' from source: unknown 29922 1726853679.89588: variable 'ansible_timeout' from source: unknown 29922 1726853679.89589: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853679.89592: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 29922 1726853679.89611: variable 'omit' from source: magic vars 29922 1726853679.89621: starting attempt loop 29922 1726853679.89628: running the handler 29922 1726853679.89644: _low_level_execute_command(): starting 29922 1726853679.89654: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 29922 1726853679.90390: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853679.90491: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853679.90505: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853679.90525: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853679.90620: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853679.92335: stdout chunk (state=3): >>>/root <<< 29922 1726853679.92494: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853679.92498: stdout chunk (state=3): >>><<< 29922 1726853679.92500: stderr chunk (state=3): >>><<< 29922 1726853679.92521: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853679.92542: _low_level_execute_command(): starting 29922 1726853679.92623: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853679.925281-31282-266810857181185 `" && echo ansible-tmp-1726853679.925281-31282-266810857181185="` echo /root/.ansible/tmp/ansible-tmp-1726853679.925281-31282-266810857181185 `" ) && sleep 0' 29922 1726853679.93149: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853679.93165: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853679.93185: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853679.93203: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853679.93217: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 29922 1726853679.93227: stderr chunk (state=3): >>>debug2: match not found <<< 29922 1726853679.93241: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853679.93256: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 29922 1726853679.93266: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<< 29922 1726853679.93287: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853679.93364: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853679.93384: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853679.93397: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853679.93488: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853679.95469: stdout chunk (state=3): >>>ansible-tmp-1726853679.925281-31282-266810857181185=/root/.ansible/tmp/ansible-tmp-1726853679.925281-31282-266810857181185 <<< 29922 1726853679.95634: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853679.95637: stdout chunk (state=3): >>><<< 29922 1726853679.95640: stderr chunk (state=3): >>><<< 29922 1726853679.95656: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853679.925281-31282-266810857181185=/root/.ansible/tmp/ansible-tmp-1726853679.925281-31282-266810857181185 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853679.95876: variable 'ansible_module_compression' from source: unknown 29922 1726853679.95880: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-29922avbm1zsd/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 29922 1726853679.95882: variable 'ansible_facts' from source: unknown 29922 1726853679.95914: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853679.925281-31282-266810857181185/AnsiballZ_network_connections.py 29922 1726853679.96127: Sending initial data 29922 1726853679.96131: Sent initial data (167 bytes) 29922 1726853679.96700: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853679.96768: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853679.96832: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853679.96847: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853679.96874: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853679.96961: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853679.98622: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension 
"fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 29922 1726853679.98709: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 29922 1726853679.98768: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-29922avbm1zsd/tmpxeb8_nfv /root/.ansible/tmp/ansible-tmp-1726853679.925281-31282-266810857181185/AnsiballZ_network_connections.py <<< 29922 1726853679.98780: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853679.925281-31282-266810857181185/AnsiballZ_network_connections.py" <<< 29922 1726853679.98816: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-29922avbm1zsd/tmpxeb8_nfv" to remote "/root/.ansible/tmp/ansible-tmp-1726853679.925281-31282-266810857181185/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853679.925281-31282-266810857181185/AnsiballZ_network_connections.py" <<< 29922 1726853680.00062: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853680.00065: stdout chunk (state=3): >>><<< 29922 1726853680.00068: stderr chunk (state=3): >>><<< 29922 1726853680.00070: done transferring module to remote 29922 1726853680.00079: _low_level_execute_command(): starting 29922 1726853680.00082: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853679.925281-31282-266810857181185/ /root/.ansible/tmp/ansible-tmp-1726853679.925281-31282-266810857181185/AnsiballZ_network_connections.py && sleep 0' 29922 1726853680.00696: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853680.00710: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853680.00722: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 29922 1726853680.00786: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853680.00816: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853680.00848: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853680.01012: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853680.02927: stderr chunk 
(state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853680.02931: stdout chunk (state=3): >>><<< 29922 1726853680.03008: stderr chunk (state=3): >>><<< 29922 1726853680.03202: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853680.03205: _low_level_execute_command(): starting 29922 1726853680.03288: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853679.925281-31282-266810857181185/AnsiballZ_network_connections.py && sleep 0' 29922 1726853680.04865: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853680.05116: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853680.05194: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853680.37421: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": 
false, "__debug_flags": ""}}} <<< 29922 1726853680.39203: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853680.39266: stderr chunk (state=3): >>>Shared connection to 10.31.11.217 closed. <<< 29922 1726853680.39284: stdout chunk (state=3): >>><<< 29922 1726853680.39301: stderr chunk (state=3): >>><<< 29922 1726853680.39328: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
29922 1726853680.39369: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'ethtest0', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853679.925281-31282-266810857181185/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 29922 1726853680.39385: _low_level_execute_command(): starting 29922 1726853680.39396: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853679.925281-31282-266810857181185/ > /dev/null 2>&1 && sleep 0' 29922 1726853680.40031: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853680.40067: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853680.40088: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 29922 1726853680.40176: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853680.40202: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853680.40305: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853680.42229: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853680.42243: stdout chunk (state=3): >>><<< 29922 1726853680.42259: stderr chunk (state=3): >>><<< 29922 1726853680.42282: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853680.42292: handler run complete 29922 1726853680.42321: attempt loop complete, returning result 29922 1726853680.42477: _execute() done 29922 1726853680.42480: dumping result to json 29922 1726853680.42482: done dumping result, returning 29922 1726853680.42485: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [02083763-bbaf-51d4-513b-000000000081] 29922 1726853680.42487: sending task result for task 02083763-bbaf-51d4-513b-000000000081 29922 1726853680.42562: done sending task result for task 02083763-bbaf-51d4-513b-000000000081 29922 1726853680.42566: WORKER PROCESS EXITING changed: [managed_node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "ethtest0", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 29922 1726853680.42660: no more pending results, returning what we have 29922 1726853680.42663: results queue empty 29922 1726853680.42664: checking for any_errors_fatal 29922 1726853680.42677: done checking for any_errors_fatal 29922 1726853680.42678: checking for max_fail_percentage 29922 1726853680.42680: done checking for max_fail_percentage 29922 1726853680.42681: checking to see if all hosts have failed and the running result is not ok 29922 1726853680.42682: done checking to see if all hosts have failed 29922 1726853680.42683: getting the remaining hosts for this loop 29922 1726853680.42684: done getting the remaining hosts for this loop 29922 1726853680.42688: getting the next task for host managed_node3 29922 1726853680.42694: done getting next task for host managed_node3 29922 1726853680.42697: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 29922 1726853680.42704: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853680.42715: getting variables 29922 1726853680.42717: in VariableManager get_vars() 29922 1726853680.42754: Calling all_inventory to load vars for managed_node3 29922 1726853680.42757: Calling groups_inventory to load vars for managed_node3 29922 1726853680.42759: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853680.42770: Calling all_plugins_play to load vars for managed_node3 29922 1726853680.43020: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853680.43025: Calling groups_plugins_play to load vars for managed_node3 29922 1726853680.44606: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853680.47402: done with get_vars() 29922 1726853680.47497: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 13:34:40 -0400 (0:00:00.703) 0:00:29.405 ****** 29922 1726853680.47585: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 29922 1726853680.48026: worker is 1 (out of 1 available) 29922 1726853680.48060: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state 29922 1726853680.48080: done queuing things up, now waiting for results queue to drain 29922 1726853680.48082: waiting for pending results... 29922 1726853680.48289: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 29922 1726853680.48469: in run() - task 02083763-bbaf-51d4-513b-000000000082 29922 1726853680.48476: variable 'ansible_search_path' from source: unknown 29922 1726853680.48479: variable 'ansible_search_path' from source: unknown 29922 1726853680.48499: calling self._execute() 29922 1726853680.48518: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853680.48524: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853680.48535: variable 'omit' from source: magic vars 29922 1726853680.48936: variable 'ansible_distribution_major_version' from source: facts 29922 1726853680.48948: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853680.49122: variable 'network_state' from source: role '' defaults 29922 1726853680.49126: Evaluated conditional (network_state != {}): False 29922 1726853680.49128: when evaluation is False, skipping this task 29922 1726853680.49131: _execute() done 29922 1726853680.49134: dumping result to json 29922 1726853680.49136: done dumping result, returning 29922 1726853680.49138: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [02083763-bbaf-51d4-513b-000000000082] 29922 1726853680.49141: sending task result for task 02083763-bbaf-51d4-513b-000000000082 29922 1726853680.49223: done sending task result for task 02083763-bbaf-51d4-513b-000000000082 29922 1726853680.49228: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 29922 1726853680.49396: no more pending results, returning what we have 29922 1726853680.49399: results queue empty 29922 1726853680.49400: checking for any_errors_fatal 29922 1726853680.49409: done checking for any_errors_fatal 29922 1726853680.49409: checking for max_fail_percentage 29922 
1726853680.49411: done checking for max_fail_percentage 29922 1726853680.49412: checking to see if all hosts have failed and the running result is not ok 29922 1726853680.49413: done checking to see if all hosts have failed 29922 1726853680.49414: getting the remaining hosts for this loop 29922 1726853680.49414: done getting the remaining hosts for this loop 29922 1726853680.49417: getting the next task for host managed_node3 29922 1726853680.49421: done getting next task for host managed_node3 29922 1726853680.49425: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 29922 1726853680.49427: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29922 1726853680.49441: getting variables 29922 1726853680.49442: in VariableManager get_vars() 29922 1726853680.49481: Calling all_inventory to load vars for managed_node3 29922 1726853680.49484: Calling groups_inventory to load vars for managed_node3 29922 1726853680.49487: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853680.49649: Calling all_plugins_play to load vars for managed_node3 29922 1726853680.49653: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853680.49659: Calling groups_plugins_play to load vars for managed_node3 29922 1726853680.54405: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853680.57945: done with get_vars() 29922 1726853680.58085: done getting variables 29922 1726853680.58148: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 13:34:40 -0400 (0:00:00.105) 0:00:29.511 ****** 29922 1726853680.58190: entering _queue_task() for managed_node3/debug 29922 1726853680.59190: worker is 1 (out of 1 available) 29922 1726853680.59204: exiting _queue_task() for managed_node3/debug 29922 1726853680.59216: done queuing things up, now waiting for results queue to drain 29922 1726853680.59218: waiting for pending results... 
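The "Configure networking state" task above is skipped because network_state comes from the role's defaults as an empty dict, so its when-condition (network_state != {}) evaluates to False. A simplified sketch of that guard pattern follows; it is illustrative only and deliberately omits the module's real arguments rather than guessing at them:

    - name: Configure networking state
      fedora.linux_system_roles.network_state:
        # module arguments omitted in this sketch (illustrative only)
      when: network_state != {}   # role default is {}, so this evaluated to False above
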
29922 1726853680.59892: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 29922 1726853680.60123: in run() - task 02083763-bbaf-51d4-513b-000000000083 29922 1726853680.60127: variable 'ansible_search_path' from source: unknown 29922 1726853680.60130: variable 'ansible_search_path' from source: unknown 29922 1726853680.60132: calling self._execute() 29922 1726853680.60247: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853680.60269: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853680.60294: variable 'omit' from source: magic vars 29922 1726853680.60842: variable 'ansible_distribution_major_version' from source: facts 29922 1726853680.60880: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853680.60893: variable 'omit' from source: magic vars 29922 1726853680.60938: variable 'omit' from source: magic vars 29922 1726853680.61002: variable 'omit' from source: magic vars 29922 1726853680.61048: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29922 1726853680.61105: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29922 1726853680.61141: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29922 1726853680.61178: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853680.61281: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853680.61288: variable 'inventory_hostname' from source: host vars for 'managed_node3' 29922 1726853680.61306: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853680.61317: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853680.61491: Set connection var ansible_connection to ssh 29922 1726853680.61505: Set connection var ansible_module_compression to ZIP_DEFLATED 29922 1726853680.61525: Set connection var ansible_shell_executable to /bin/sh 29922 1726853680.61577: Set connection var ansible_pipelining to False 29922 1726853680.61581: Set connection var ansible_timeout to 10 29922 1726853680.61584: Set connection var ansible_shell_type to sh 29922 1726853680.61595: variable 'ansible_shell_executable' from source: unknown 29922 1726853680.61604: variable 'ansible_connection' from source: unknown 29922 1726853680.61612: variable 'ansible_module_compression' from source: unknown 29922 1726853680.61620: variable 'ansible_shell_type' from source: unknown 29922 1726853680.61634: variable 'ansible_shell_executable' from source: unknown 29922 1726853680.61748: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853680.61754: variable 'ansible_pipelining' from source: unknown 29922 1726853680.61762: variable 'ansible_timeout' from source: unknown 29922 1726853680.61766: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853680.61883: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 29922 
1726853680.61900: variable 'omit' from source: magic vars 29922 1726853680.61910: starting attempt loop 29922 1726853680.61924: running the handler 29922 1726853680.62094: variable '__network_connections_result' from source: set_fact 29922 1726853680.62176: handler run complete 29922 1726853680.62183: attempt loop complete, returning result 29922 1726853680.62186: _execute() done 29922 1726853680.62189: dumping result to json 29922 1726853680.62200: done dumping result, returning 29922 1726853680.62215: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [02083763-bbaf-51d4-513b-000000000083] 29922 1726853680.62279: sending task result for task 02083763-bbaf-51d4-513b-000000000083 29922 1726853680.62567: done sending task result for task 02083763-bbaf-51d4-513b-000000000083 29922 1726853680.62570: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result.stderr_lines": [ "" ] } 29922 1726853680.62633: no more pending results, returning what we have 29922 1726853680.62636: results queue empty 29922 1726853680.62637: checking for any_errors_fatal 29922 1726853680.62644: done checking for any_errors_fatal 29922 1726853680.62645: checking for max_fail_percentage 29922 1726853680.62646: done checking for max_fail_percentage 29922 1726853680.62647: checking to see if all hosts have failed and the running result is not ok 29922 1726853680.62648: done checking to see if all hosts have failed 29922 1726853680.62649: getting the remaining hosts for this loop 29922 1726853680.62650: done getting the remaining hosts for this loop 29922 1726853680.62654: getting the next task for host managed_node3 29922 1726853680.62661: done getting next task for host managed_node3 29922 1726853680.62665: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 29922 1726853680.62668: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853680.62680: getting variables 29922 1726853680.62682: in VariableManager get_vars() 29922 1726853680.62719: Calling all_inventory to load vars for managed_node3 29922 1726853680.62722: Calling groups_inventory to load vars for managed_node3 29922 1726853680.62724: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853680.62734: Calling all_plugins_play to load vars for managed_node3 29922 1726853680.62737: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853680.62740: Calling groups_plugins_play to load vars for managed_node3 29922 1726853680.64715: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853680.68202: done with get_vars() 29922 1726853680.68247: done getting variables 29922 1726853680.68319: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 13:34:40 -0400 (0:00:00.101) 0:00:29.613 ****** 29922 1726853680.68580: entering _queue_task() for managed_node3/debug 29922 1726853680.69506: worker is 1 (out of 1 available) 29922 1726853680.69524: exiting _queue_task() for managed_node3/debug 29922 1726853680.69584: done queuing things up, now waiting for results queue to drain 29922 1726853680.69587: waiting for pending results... 29922 1726853680.70113: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 29922 1726853680.70429: in run() - task 02083763-bbaf-51d4-513b-000000000084 29922 1726853680.70445: variable 'ansible_search_path' from source: unknown 29922 1726853680.70472: variable 'ansible_search_path' from source: unknown 29922 1726853680.70500: calling self._execute() 29922 1726853680.70826: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853680.70834: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853680.70842: variable 'omit' from source: magic vars 29922 1726853680.71427: variable 'ansible_distribution_major_version' from source: facts 29922 1726853680.71451: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853680.71457: variable 'omit' from source: magic vars 29922 1726853680.71625: variable 'omit' from source: magic vars 29922 1726853680.71662: variable 'omit' from source: magic vars 29922 1726853680.71702: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29922 1726853680.71737: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29922 1726853680.71760: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29922 1726853680.71774: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853680.71992: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853680.72023: variable 
'inventory_hostname' from source: host vars for 'managed_node3' 29922 1726853680.72026: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853680.72029: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853680.72125: Set connection var ansible_connection to ssh 29922 1726853680.72165: Set connection var ansible_module_compression to ZIP_DEFLATED 29922 1726853680.72168: Set connection var ansible_shell_executable to /bin/sh 29922 1726853680.72172: Set connection var ansible_pipelining to False 29922 1726853680.72175: Set connection var ansible_timeout to 10 29922 1726853680.72177: Set connection var ansible_shell_type to sh 29922 1726853680.72385: variable 'ansible_shell_executable' from source: unknown 29922 1726853680.72388: variable 'ansible_connection' from source: unknown 29922 1726853680.72391: variable 'ansible_module_compression' from source: unknown 29922 1726853680.72394: variable 'ansible_shell_type' from source: unknown 29922 1726853680.72396: variable 'ansible_shell_executable' from source: unknown 29922 1726853680.72398: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853680.72400: variable 'ansible_pipelining' from source: unknown 29922 1726853680.72402: variable 'ansible_timeout' from source: unknown 29922 1726853680.72405: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853680.72566: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 29922 1726853680.72570: variable 'omit' from source: magic vars 29922 1726853680.72576: starting attempt loop 29922 1726853680.72578: running the handler 29922 1726853680.72707: variable '__network_connections_result' from source: set_fact 29922 1726853680.72826: variable '__network_connections_result' from source: set_fact 29922 1726853680.72922: handler run complete 29922 1726853680.72963: attempt loop complete, returning result 29922 1726853680.72973: _execute() done 29922 1726853680.72979: dumping result to json 29922 1726853680.72987: done dumping result, returning 29922 1726853680.72998: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [02083763-bbaf-51d4-513b-000000000084] 29922 1726853680.73007: sending task result for task 02083763-bbaf-51d4-513b-000000000084 ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "ethtest0", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 29922 1726853680.73241: no more pending results, returning what we have 29922 1726853680.73245: results queue empty 29922 1726853680.73246: checking for any_errors_fatal 29922 1726853680.73259: done checking for any_errors_fatal 29922 1726853680.73260: checking for max_fail_percentage 29922 1726853680.73261: done checking for max_fail_percentage 29922 1726853680.73263: checking to see if all hosts have failed and the running result is not ok 29922 1726853680.73263: done checking to see if all hosts have failed 29922 1726853680.73264: 
getting the remaining hosts for this loop 29922 1726853680.73266: done getting the remaining hosts for this loop 29922 1726853680.73277: getting the next task for host managed_node3 29922 1726853680.73284: done getting next task for host managed_node3 29922 1726853680.73288: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 29922 1726853680.73291: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29922 1726853680.73304: getting variables 29922 1726853680.73305: in VariableManager get_vars() 29922 1726853680.73345: Calling all_inventory to load vars for managed_node3 29922 1726853680.73348: Calling groups_inventory to load vars for managed_node3 29922 1726853680.73350: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853680.73365: Calling all_plugins_play to load vars for managed_node3 29922 1726853680.73369: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853680.73504: done sending task result for task 02083763-bbaf-51d4-513b-000000000084 29922 1726853680.73507: WORKER PROCESS EXITING 29922 1726853680.73726: Calling groups_plugins_play to load vars for managed_node3 29922 1726853680.75941: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853680.79231: done with get_vars() 29922 1726853680.79381: done getting variables 29922 1726853680.79445: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 13:34:40 -0400 (0:00:00.112) 0:00:29.725 ****** 29922 1726853680.79609: entering _queue_task() for managed_node3/debug 29922 1726853680.80540: worker is 1 (out of 1 available) 29922 1726853680.80554: exiting _queue_task() for managed_node3/debug 29922 1726853680.80568: done queuing things up, now waiting for results queue to drain 29922 1726853680.80569: waiting for pending results... 
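The two debug tasks above ("Show stderr messages for the network_connections" and "Show debug messages for the network_connections") simply print fields of the registered __network_connections_result fact, which is why their output keys match the variable names. A minimal equivalent, sketched here for readability rather than quoted from the role's task file, would be:

    # illustrative sketch of the two debug tasks, not the role's verbatim source
    - name: Show stderr messages for the network_connections
      ansible.builtin.debug:
        var: __network_connections_result.stderr_lines

    - name: Show debug messages for the network_connections
      ansible.builtin.debug:
        var: __network_connections_result
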
29922 1726853680.81013: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 29922 1726853680.81308: in run() - task 02083763-bbaf-51d4-513b-000000000085 29922 1726853680.81313: variable 'ansible_search_path' from source: unknown 29922 1726853680.81316: variable 'ansible_search_path' from source: unknown 29922 1726853680.81468: calling self._execute() 29922 1726853680.81691: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853680.81695: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853680.81707: variable 'omit' from source: magic vars 29922 1726853680.82492: variable 'ansible_distribution_major_version' from source: facts 29922 1726853680.82504: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853680.82739: variable 'network_state' from source: role '' defaults 29922 1726853680.82750: Evaluated conditional (network_state != {}): False 29922 1726853680.82753: when evaluation is False, skipping this task 29922 1726853680.82876: _execute() done 29922 1726853680.82880: dumping result to json 29922 1726853680.82883: done dumping result, returning 29922 1726853680.82891: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [02083763-bbaf-51d4-513b-000000000085] 29922 1726853680.82897: sending task result for task 02083763-bbaf-51d4-513b-000000000085 29922 1726853680.82991: done sending task result for task 02083763-bbaf-51d4-513b-000000000085 29922 1726853680.82995: WORKER PROCESS EXITING skipping: [managed_node3] => { "false_condition": "network_state != {}" } 29922 1726853680.83046: no more pending results, returning what we have 29922 1726853680.83051: results queue empty 29922 1726853680.83052: checking for any_errors_fatal 29922 1726853680.83060: done checking for any_errors_fatal 29922 1726853680.83061: checking for max_fail_percentage 29922 1726853680.83063: done checking for max_fail_percentage 29922 1726853680.83064: checking to see if all hosts have failed and the running result is not ok 29922 1726853680.83065: done checking to see if all hosts have failed 29922 1726853680.83065: getting the remaining hosts for this loop 29922 1726853680.83067: done getting the remaining hosts for this loop 29922 1726853680.83070: getting the next task for host managed_node3 29922 1726853680.83084: done getting next task for host managed_node3 29922 1726853680.83088: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 29922 1726853680.83092: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853680.83109: getting variables 29922 1726853680.83111: in VariableManager get_vars() 29922 1726853680.83152: Calling all_inventory to load vars for managed_node3 29922 1726853680.83155: Calling groups_inventory to load vars for managed_node3 29922 1726853680.83158: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853680.83374: Calling all_plugins_play to load vars for managed_node3 29922 1726853680.83380: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853680.83384: Calling groups_plugins_play to load vars for managed_node3 29922 1726853680.86626: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853680.90254: done with get_vars() 29922 1726853680.90385: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 13:34:40 -0400 (0:00:00.108) 0:00:29.834 ****** 29922 1726853680.90489: entering _queue_task() for managed_node3/ping 29922 1726853680.91269: worker is 1 (out of 1 available) 29922 1726853680.91284: exiting _queue_task() for managed_node3/ping 29922 1726853680.91519: done queuing things up, now waiting for results queue to drain 29922 1726853680.91521: waiting for pending results... 29922 1726853680.92192: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 29922 1726853680.92198: in run() - task 02083763-bbaf-51d4-513b-000000000086 29922 1726853680.92201: variable 'ansible_search_path' from source: unknown 29922 1726853680.92204: variable 'ansible_search_path' from source: unknown 29922 1726853680.92229: calling self._execute() 29922 1726853680.92453: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853680.92459: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853680.92478: variable 'omit' from source: magic vars 29922 1726853680.93359: variable 'ansible_distribution_major_version' from source: facts 29922 1726853680.93368: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853680.93607: variable 'omit' from source: magic vars 29922 1726853680.93611: variable 'omit' from source: magic vars 29922 1726853680.93634: variable 'omit' from source: magic vars 29922 1726853680.93677: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29922 1726853680.93792: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29922 1726853680.93854: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29922 1726853680.93875: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853680.93888: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853680.94033: variable 'inventory_hostname' from source: host vars for 'managed_node3' 29922 1726853680.94038: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853680.94041: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853680.94259: Set connection var ansible_connection to ssh 29922 1726853680.94262: Set connection var 
ansible_module_compression to ZIP_DEFLATED 29922 1726853680.94585: Set connection var ansible_shell_executable to /bin/sh 29922 1726853680.94589: Set connection var ansible_pipelining to False 29922 1726853680.94591: Set connection var ansible_timeout to 10 29922 1726853680.94594: Set connection var ansible_shell_type to sh 29922 1726853680.94596: variable 'ansible_shell_executable' from source: unknown 29922 1726853680.94598: variable 'ansible_connection' from source: unknown 29922 1726853680.94601: variable 'ansible_module_compression' from source: unknown 29922 1726853680.94603: variable 'ansible_shell_type' from source: unknown 29922 1726853680.94605: variable 'ansible_shell_executable' from source: unknown 29922 1726853680.94607: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853680.94609: variable 'ansible_pipelining' from source: unknown 29922 1726853680.94612: variable 'ansible_timeout' from source: unknown 29922 1726853680.94614: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853680.94885: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 29922 1726853680.94901: variable 'omit' from source: magic vars 29922 1726853680.94906: starting attempt loop 29922 1726853680.94909: running the handler 29922 1726853680.94924: _low_level_execute_command(): starting 29922 1726853680.94937: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 29922 1726853680.96579: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853680.96585: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853680.96659: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853680.96998: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853680.97095: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853680.98817: stdout chunk (state=3): >>>/root <<< 29922 1726853680.99062: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853680.99066: stdout chunk (state=3): >>><<< 29922 1726853680.99074: stderr chunk (state=3): >>><<< 29922 1726853680.99097: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853680.99114: _low_level_execute_command(): starting 29922 1726853680.99120: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853680.9909759-31337-84417281110810 `" && echo ansible-tmp-1726853680.9909759-31337-84417281110810="` echo /root/.ansible/tmp/ansible-tmp-1726853680.9909759-31337-84417281110810 `" ) && sleep 0' 29922 1726853681.00430: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853681.00434: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853681.00437: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853681.00440: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853681.00442: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 29922 1726853681.00576: stderr chunk (state=3): >>>debug2: match not found <<< 29922 1726853681.00580: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853681.00591: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 29922 1726853681.00594: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<< 29922 1726853681.00596: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 29922 1726853681.00598: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853681.00600: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853681.00602: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853681.00605: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853681.00734: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853681.00970: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853681.02813: stdout chunk (state=3): 
>>>ansible-tmp-1726853680.9909759-31337-84417281110810=/root/.ansible/tmp/ansible-tmp-1726853680.9909759-31337-84417281110810 <<< 29922 1726853681.02989: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853681.03064: stderr chunk (state=3): >>><<< 29922 1726853681.03067: stdout chunk (state=3): >>><<< 29922 1726853681.03205: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853680.9909759-31337-84417281110810=/root/.ansible/tmp/ansible-tmp-1726853680.9909759-31337-84417281110810 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853681.03222: variable 'ansible_module_compression' from source: unknown 29922 1726853681.03378: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-29922avbm1zsd/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 29922 1726853681.03420: variable 'ansible_facts' from source: unknown 29922 1726853681.03645: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853680.9909759-31337-84417281110810/AnsiballZ_ping.py 29922 1726853681.04078: Sending initial data 29922 1726853681.04081: Sent initial data (152 bytes) 29922 1726853681.05292: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853681.05454: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853681.05461: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853681.05463: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 
29922 1726853681.05702: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853681.07370: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 29922 1726853681.07485: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 29922 1726853681.07490: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-29922avbm1zsd/tmpt8ux8uef /root/.ansible/tmp/ansible-tmp-1726853680.9909759-31337-84417281110810/AnsiballZ_ping.py <<< 29922 1726853681.07492: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853680.9909759-31337-84417281110810/AnsiballZ_ping.py" <<< 29922 1726853681.07577: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-29922avbm1zsd/tmpt8ux8uef" to remote "/root/.ansible/tmp/ansible-tmp-1726853680.9909759-31337-84417281110810/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853680.9909759-31337-84417281110810/AnsiballZ_ping.py" <<< 29922 1726853681.09379: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853681.09383: stderr chunk (state=3): >>><<< 29922 1726853681.09385: stdout chunk (state=3): >>><<< 29922 1726853681.09387: done transferring module to remote 29922 1726853681.09390: _low_level_execute_command(): starting 29922 1726853681.09392: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853680.9909759-31337-84417281110810/ /root/.ansible/tmp/ansible-tmp-1726853680.9909759-31337-84417281110810/AnsiballZ_ping.py && sleep 0' 29922 1726853681.10630: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853681.10678: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853681.10965: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853681.10976: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853681.10979: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853681.12830: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853681.13057: stderr chunk (state=3): >>><<< 29922 1726853681.13061: stdout chunk (state=3): >>><<< 29922 1726853681.13064: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853681.13066: _low_level_execute_command(): starting 29922 1726853681.13068: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853680.9909759-31337-84417281110810/AnsiballZ_ping.py && sleep 0' 29922 1726853681.14192: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853681.14293: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853681.14313: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853681.14327: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 29922 1726853681.14486: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853681.14498: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853681.14513: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853681.14525: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853681.14788: stderr chunk 
(state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853681.30249: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 29922 1726853681.31646: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 29922 1726853681.31650: stdout chunk (state=3): >>><<< 29922 1726853681.31657: stderr chunk (state=3): >>><<< 29922 1726853681.31730: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
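For reference, the {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} payload in the stdout chunk above is the entire result the remote module prints before the controller parses it. Below is a minimal sketch of a ping-style module, assuming only the standard AnsibleModule boilerplate; it illustrates the behaviour recorded here and is not the verbatim source of ansible.builtin.ping.

    # ping_sketch.py - minimal ping-style Ansible module (illustrative sketch, not the builtin source)
    from ansible.module_utils.basic import AnsibleModule

    def main():
        module = AnsibleModule(
            argument_spec=dict(data=dict(type='str', default='pong')),
            supports_check_mode=True,
        )
        data = module.params['data']
        if data == 'crash':
            # the builtin module deliberately raises here so failure paths can be exercised in tests
            raise Exception('boom')
        # exit_json() prints the JSON result seen in the stdout chunk above and exits 0
        module.exit_json(changed=False, ping=data)

    if __name__ == '__main__':
        main()

When wrapped as AnsiballZ_ping.py, copied to the remote temp directory and executed with the remote Python, a module of this shape produces exactly the kind of {"ping": "pong"} line captured above.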
29922 1726853681.31761: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853680.9909759-31337-84417281110810/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 29922 1726853681.31768: _low_level_execute_command(): starting 29922 1726853681.31773: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853680.9909759-31337-84417281110810/ > /dev/null 2>&1 && sleep 0' 29922 1726853681.33439: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853681.33460: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853681.33482: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853681.33499: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853681.33609: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853681.33744: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853681.33785: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853681.33884: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853681.35828: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853681.35889: stdout chunk (state=3): >>><<< 29922 1726853681.35901: stderr chunk (state=3): >>><<< 29922 1726853681.35923: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853681.35943: handler run complete 29922 1726853681.36006: attempt loop complete, returning result 29922 1726853681.36066: _execute() done 29922 1726853681.36214: dumping result to json 29922 1726853681.36480: done dumping result, returning 29922 1726853681.36483: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [02083763-bbaf-51d4-513b-000000000086] 29922 1726853681.36486: sending task result for task 02083763-bbaf-51d4-513b-000000000086 29922 1726853681.36558: done sending task result for task 02083763-bbaf-51d4-513b-000000000086 29922 1726853681.36561: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "ping": "pong" } 29922 1726853681.36642: no more pending results, returning what we have 29922 1726853681.36646: results queue empty 29922 1726853681.36647: checking for any_errors_fatal 29922 1726853681.36654: done checking for any_errors_fatal 29922 1726853681.36657: checking for max_fail_percentage 29922 1726853681.36660: done checking for max_fail_percentage 29922 1726853681.36661: checking to see if all hosts have failed and the running result is not ok 29922 1726853681.36662: done checking to see if all hosts have failed 29922 1726853681.36662: getting the remaining hosts for this loop 29922 1726853681.36664: done getting the remaining hosts for this loop 29922 1726853681.36667: getting the next task for host managed_node3 29922 1726853681.36677: done getting next task for host managed_node3 29922 1726853681.36680: ^ task is: TASK: meta (role_complete) 29922 1726853681.36682: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853681.36781: getting variables 29922 1726853681.36783: in VariableManager get_vars() 29922 1726853681.36829: Calling all_inventory to load vars for managed_node3 29922 1726853681.36831: Calling groups_inventory to load vars for managed_node3 29922 1726853681.36834: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853681.36846: Calling all_plugins_play to load vars for managed_node3 29922 1726853681.36849: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853681.36852: Calling groups_plugins_play to load vars for managed_node3 29922 1726853681.40284: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853681.43816: done with get_vars() 29922 1726853681.43963: done getting variables 29922 1726853681.44048: done queuing things up, now waiting for results queue to drain 29922 1726853681.44050: results queue empty 29922 1726853681.44051: checking for any_errors_fatal 29922 1726853681.44054: done checking for any_errors_fatal 29922 1726853681.44057: checking for max_fail_percentage 29922 1726853681.44058: done checking for max_fail_percentage 29922 1726853681.44059: checking to see if all hosts have failed and the running result is not ok 29922 1726853681.44060: done checking to see if all hosts have failed 29922 1726853681.44061: getting the remaining hosts for this loop 29922 1726853681.44061: done getting the remaining hosts for this loop 29922 1726853681.44064: getting the next task for host managed_node3 29922 1726853681.44067: done getting next task for host managed_node3 29922 1726853681.44069: ^ task is: TASK: meta (flush_handlers) 29922 1726853681.44147: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853681.44151: getting variables 29922 1726853681.44152: in VariableManager get_vars() 29922 1726853681.44168: Calling all_inventory to load vars for managed_node3 29922 1726853681.44172: Calling groups_inventory to load vars for managed_node3 29922 1726853681.44174: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853681.44184: Calling all_plugins_play to load vars for managed_node3 29922 1726853681.44186: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853681.44189: Calling groups_plugins_play to load vars for managed_node3 29922 1726853681.45973: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853681.47808: done with get_vars() 29922 1726853681.47837: done getting variables 29922 1726853681.47898: in VariableManager get_vars() 29922 1726853681.47912: Calling all_inventory to load vars for managed_node3 29922 1726853681.47914: Calling groups_inventory to load vars for managed_node3 29922 1726853681.47916: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853681.47921: Calling all_plugins_play to load vars for managed_node3 29922 1726853681.47923: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853681.47926: Calling groups_plugins_play to load vars for managed_node3 29922 1726853681.49557: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853681.51433: done with get_vars() 29922 1726853681.51481: done queuing things up, now waiting for results queue to drain 29922 1726853681.51483: results queue empty 29922 1726853681.51484: checking for any_errors_fatal 29922 1726853681.51485: done checking for any_errors_fatal 29922 1726853681.51486: checking for max_fail_percentage 29922 1726853681.51488: done checking for max_fail_percentage 29922 1726853681.51488: checking to see if all hosts have failed and the running result is not ok 29922 1726853681.51489: done checking to see if all hosts have failed 29922 1726853681.51490: getting the remaining hosts for this loop 29922 1726853681.51491: done getting the remaining hosts for this loop 29922 1726853681.51494: getting the next task for host managed_node3 29922 1726853681.51498: done getting next task for host managed_node3 29922 1726853681.51500: ^ task is: TASK: meta (flush_handlers) 29922 1726853681.51501: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853681.51504: getting variables 29922 1726853681.51505: in VariableManager get_vars() 29922 1726853681.51518: Calling all_inventory to load vars for managed_node3 29922 1726853681.51520: Calling groups_inventory to load vars for managed_node3 29922 1726853681.51522: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853681.51527: Calling all_plugins_play to load vars for managed_node3 29922 1726853681.51530: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853681.51533: Calling groups_plugins_play to load vars for managed_node3 29922 1726853681.52889: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853681.55546: done with get_vars() 29922 1726853681.55580: done getting variables 29922 1726853681.55748: in VariableManager get_vars() 29922 1726853681.55763: Calling all_inventory to load vars for managed_node3 29922 1726853681.55766: Calling groups_inventory to load vars for managed_node3 29922 1726853681.55768: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853681.55933: Calling all_plugins_play to load vars for managed_node3 29922 1726853681.55936: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853681.55940: Calling groups_plugins_play to load vars for managed_node3 29922 1726853681.57577: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853681.60590: done with get_vars() 29922 1726853681.60626: done queuing things up, now waiting for results queue to drain 29922 1726853681.60629: results queue empty 29922 1726853681.60630: checking for any_errors_fatal 29922 1726853681.60631: done checking for any_errors_fatal 29922 1726853681.60638: checking for max_fail_percentage 29922 1726853681.60639: done checking for max_fail_percentage 29922 1726853681.60640: checking to see if all hosts have failed and the running result is not ok 29922 1726853681.60640: done checking to see if all hosts have failed 29922 1726853681.60641: getting the remaining hosts for this loop 29922 1726853681.60642: done getting the remaining hosts for this loop 29922 1726853681.60645: getting the next task for host managed_node3 29922 1726853681.60649: done getting next task for host managed_node3 29922 1726853681.60650: ^ task is: None 29922 1726853681.60652: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853681.60653: done queuing things up, now waiting for results queue to drain 29922 1726853681.60654: results queue empty 29922 1726853681.60657: checking for any_errors_fatal 29922 1726853681.60658: done checking for any_errors_fatal 29922 1726853681.60659: checking for max_fail_percentage 29922 1726853681.60665: done checking for max_fail_percentage 29922 1726853681.60666: checking to see if all hosts have failed and the running result is not ok 29922 1726853681.60666: done checking to see if all hosts have failed 29922 1726853681.60668: getting the next task for host managed_node3 29922 1726853681.60678: done getting next task for host managed_node3 29922 1726853681.60679: ^ task is: None 29922 1726853681.60681: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29922 1726853681.60783: in VariableManager get_vars() 29922 1726853681.60820: done with get_vars() 29922 1726853681.60827: in VariableManager get_vars() 29922 1726853681.60838: done with get_vars() 29922 1726853681.60842: variable 'omit' from source: magic vars 29922 1726853681.60919: in VariableManager get_vars() 29922 1726853681.60932: done with get_vars() 29922 1726853681.61045: variable 'omit' from source: magic vars PLAY [Delete the interface] **************************************************** 29922 1726853681.61535: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 29922 1726853681.61615: getting the remaining hosts for this loop 29922 1726853681.61617: done getting the remaining hosts for this loop 29922 1726853681.61621: getting the next task for host managed_node3 29922 1726853681.61626: done getting next task for host managed_node3 29922 1726853681.61628: ^ task is: TASK: Gathering Facts 29922 1726853681.61630: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853681.61632: getting variables 29922 1726853681.61632: in VariableManager get_vars() 29922 1726853681.61640: Calling all_inventory to load vars for managed_node3 29922 1726853681.61643: Calling groups_inventory to load vars for managed_node3 29922 1726853681.61645: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853681.61650: Calling all_plugins_play to load vars for managed_node3 29922 1726853681.61652: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853681.61657: Calling groups_plugins_play to load vars for managed_node3 29922 1726853681.64030: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853681.67217: done with get_vars() 29922 1726853681.67250: done getting variables 29922 1726853681.67500: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:5 Friday 20 September 2024 13:34:41 -0400 (0:00:00.770) 0:00:30.604 ****** 29922 1726853681.67532: entering _queue_task() for managed_node3/gather_facts 29922 1726853681.68292: worker is 1 (out of 1 available) 29922 1726853681.68304: exiting _queue_task() for managed_node3/gather_facts 29922 1726853681.68316: done queuing things up, now waiting for results queue to drain 29922 1726853681.68317: waiting for pending results... 
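Before running the handler, _execute() will again log "Evaluated conditional (ansible_distribution_major_version != '6'): True", just as it did for the Re-test connectivity task above. A small sketch of what that check amounts to; the fact values are abbreviated copies of the setup payload further down in this log, and treating the expression as a role-level when: condition is an assumption, not something stated in the output.

    # conditional_sketch.py - what the repeatedly logged conditional boils down to
    facts = {
        "ansible_distribution": "CentOS",
        "ansible_distribution_major_version": "10",
        "ansible_os_family": "RedHat",
    }

    def conditional_holds(facts):
        # equivalent of the logged expression: ansible_distribution_major_version != '6'
        return facts["ansible_distribution_major_version"] != "6"

    print(conditional_holds(facts))  # True, matching "Evaluated conditional ...: True"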
29922 1726853681.68721: running TaskExecutor() for managed_node3/TASK: Gathering Facts 29922 1726853681.68910: in run() - task 02083763-bbaf-51d4-513b-00000000057e 29922 1726853681.68928: variable 'ansible_search_path' from source: unknown 29922 1726853681.68965: calling self._execute() 29922 1726853681.69249: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853681.69252: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853681.69265: variable 'omit' from source: magic vars 29922 1726853681.70282: variable 'ansible_distribution_major_version' from source: facts 29922 1726853681.70376: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853681.70380: variable 'omit' from source: magic vars 29922 1726853681.70382: variable 'omit' from source: magic vars 29922 1726853681.70478: variable 'omit' from source: magic vars 29922 1726853681.70517: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29922 1726853681.70554: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29922 1726853681.70741: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29922 1726853681.70848: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853681.71010: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853681.71045: variable 'inventory_hostname' from source: host vars for 'managed_node3' 29922 1726853681.71048: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853681.71080: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853681.71309: Set connection var ansible_connection to ssh 29922 1726853681.71312: Set connection var ansible_module_compression to ZIP_DEFLATED 29922 1726853681.71315: Set connection var ansible_shell_executable to /bin/sh 29922 1726853681.71334: Set connection var ansible_pipelining to False 29922 1726853681.71337: Set connection var ansible_timeout to 10 29922 1726853681.71339: Set connection var ansible_shell_type to sh 29922 1726853681.71521: variable 'ansible_shell_executable' from source: unknown 29922 1726853681.71524: variable 'ansible_connection' from source: unknown 29922 1726853681.71531: variable 'ansible_module_compression' from source: unknown 29922 1726853681.71538: variable 'ansible_shell_type' from source: unknown 29922 1726853681.71544: variable 'ansible_shell_executable' from source: unknown 29922 1726853681.71547: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853681.71549: variable 'ansible_pipelining' from source: unknown 29922 1726853681.71552: variable 'ansible_timeout' from source: unknown 29922 1726853681.71554: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853681.72157: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 29922 1726853681.72176: variable 'omit' from source: magic vars 29922 1726853681.72180: starting attempt loop 29922 1726853681.72205: running the 
handler 29922 1726853681.72295: variable 'ansible_facts' from source: unknown 29922 1726853681.72475: _low_level_execute_command(): starting 29922 1726853681.72478: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 29922 1726853681.74206: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853681.74406: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853681.74445: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853681.74603: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853681.76301: stdout chunk (state=3): >>>/root <<< 29922 1726853681.76497: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853681.76500: stdout chunk (state=3): >>><<< 29922 1726853681.76503: stderr chunk (state=3): >>><<< 29922 1726853681.76524: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853681.76550: _low_level_execute_command(): starting 29922 1726853681.76561: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853681.7653592-31364-192648150862437 `" && echo ansible-tmp-1726853681.7653592-31364-192648150862437="` echo /root/.ansible/tmp/ansible-tmp-1726853681.7653592-31364-192648150862437 `" ) && sleep 0' 
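The mkdir command just issued repeats the fixed per-task pattern already visible for the ping task above: create a private temp directory under ~/.ansible/tmp, put the AnsiballZ_<module>.py payload there over SFTP, chmod it, run it with the remote Python, and remove the directory afterwards. A rough sketch of that sequence follows, assuming placeholder host, module name and paths rather than values taken from this run.

    # low_level_sketch.py - illustrative outline of the per-task remote command sequence
    import os, random, shlex, subprocess, time

    host = "managed_node"                      # placeholder, not from this log
    remote_python = "/usr/bin/python3.12"
    tmp = f"/root/.ansible/tmp/ansible-tmp-{time.time()}-{os.getpid()}-{random.randint(0, 2**48)}"
    module = tmp + "/AnsiballZ_ping.py"        # module name is a placeholder here

    def ssh(cmd):
        # mirrors _low_level_execute_command(): run "/bin/sh -c '<cmd> && sleep 0'" on the target
        return subprocess.run(
            ["ssh", host, "/bin/sh -c " + shlex.quote(cmd + " && sleep 0")],
            capture_output=True, text=True)

    ssh(f'( umask 77 && mkdir -p "{tmp}" )')             # 1. private per-task temp dir
    # 2. sftp put of the AnsiballZ payload happens here (omitted in this sketch)
    ssh(f"chmod u+x {tmp}/ {module}")                    # 3. mark the payload executable
    print(ssh(f"{remote_python} {module}").stdout)       # 4. run the module, capture its JSON
    ssh(f"rm -f -r {tmp}/ > /dev/null 2>&1")             # 5. clean up the temp dir

The real temp-directory name recorded in the log also embeds the worker's timestamp, pid and a random suffix; the sketch above only approximates that naming and the quoting that Ansible performs.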
29922 1726853681.78220: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853681.78357: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853681.78388: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration <<< 29922 1726853681.78476: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853681.78757: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853681.78879: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853681.80830: stdout chunk (state=3): >>>ansible-tmp-1726853681.7653592-31364-192648150862437=/root/.ansible/tmp/ansible-tmp-1726853681.7653592-31364-192648150862437 <<< 29922 1726853681.81004: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853681.81011: stdout chunk (state=3): >>><<< 29922 1726853681.81015: stderr chunk (state=3): >>><<< 29922 1726853681.81393: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853681.7653592-31364-192648150862437=/root/.ansible/tmp/ansible-tmp-1726853681.7653592-31364-192648150862437 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853681.81398: variable 'ansible_module_compression' from source: unknown 29922 1726853681.81400: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-29922avbm1zsd/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 29922 1726853681.81581: variable 'ansible_facts' from source: unknown 29922 1726853681.82183: 
transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853681.7653592-31364-192648150862437/AnsiballZ_setup.py 29922 1726853681.82503: Sending initial data 29922 1726853681.82511: Sent initial data (154 bytes) 29922 1726853681.83907: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853681.83922: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853681.83938: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853681.84148: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853681.84183: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853681.84335: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853681.85961: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 29922 1726853681.86087: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 29922 1726853681.86152: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853681.7653592-31364-192648150862437/AnsiballZ_setup.py" <<< 29922 1726853681.86190: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-29922avbm1zsd/tmp_kfwqo2g /root/.ansible/tmp/ansible-tmp-1726853681.7653592-31364-192648150862437/AnsiballZ_setup.py <<< 29922 1726853681.86194: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-29922avbm1zsd/tmp_kfwqo2g" to remote "/root/.ansible/tmp/ansible-tmp-1726853681.7653592-31364-192648150862437/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853681.7653592-31364-192648150862437/AnsiballZ_setup.py" <<< 29922 1726853681.89462: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853681.89468: stdout chunk (state=3): >>><<< 29922 1726853681.89480: stderr chunk (state=3): >>><<< 29922 1726853681.89484: done transferring module to remote 29922 1726853681.89486: _low_level_execute_command(): starting 29922 1726853681.89490: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853681.7653592-31364-192648150862437/ /root/.ansible/tmp/ansible-tmp-1726853681.7653592-31364-192648150862437/AnsiballZ_setup.py && sleep 0' 29922 1726853681.91013: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853681.91017: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853681.91053: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 29922 1726853681.91064: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853681.91251: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853681.91355: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853681.91383: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853681.93295: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853681.93584: stderr chunk (state=3): >>><<< 29922 1726853681.93587: stdout chunk (state=3): >>><<< 29922 1726853681.93590: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853681.93592: _low_level_execute_command(): starting 29922 1726853681.93595: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853681.7653592-31364-192648150862437/AnsiballZ_setup.py && sleep 0' 29922 1726853681.95089: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853681.95157: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853681.95205: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853681.95306: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853681.95425: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853682.63940: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_lsb": {}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": 
"UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 55604 10.31.11.217 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 55604 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_apparmor": {"status": "disabled"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-11-217.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-217", "ansible_nodename": "ip-10-31-11-217.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2a85cf21f17783c9da20681cb8e352", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCoYDjPBT+oZMoH+Iz89VMad3pzgkIKqOuUO8QZPyEpVfgGNjVOaglgkXLQyOXulclB6EA4nlBVmVP6IyWL+N4gskjf5Qmm5n+WHu3amXXm9l66v+yKWruV18sgIn8o6iAdCgZrJFzdFVNeKRLYV6P67syyegqRkOJe7U2m/rxA967Vm6wcrwJN8eiZc8tbx1lHOuJNCcP20ThNxMHIPfvT2im8rlt/ojf6e+Z1axRnvFBubBg1qDfxHEX6AlxMHl+CIOXxGHxsvSxfLq7lhrXrComBSxTDv+fmHeJkck3VGck2rn8Hz3eBTty453RU3pq6KxdqU1GB+Z+VYHInXEtXb2i38QpLqRfiLqwUxdjkFKqz0j5z/+NskQF3w8j6uz77Revs9G8eMs14sICJtnD9qBLhFuGxlR/ovZCjynVjBTKBfDaVASpjV0iCIZKSgLn6zSaM/6FBhnBoeb4ch2iiw2S8Q0aKJNKIp0AfY21IVplyS3VCu+Br3fIXzoINo0s=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIX3PaAHBZzPq23oQJkywX/2bB39yz9ZrSLgsNfhL04NHwnY0Up/oiN+aiUte1DWFqV5wiDLJpl9a1tDLARWXNA=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIBb2CS4KVp6KVVqnGA45j7tkSkijXfGxbd3neKpDsjh9", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_fips": false, "ansible_fibre_channel_wwn": [], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_loadavg": {"1m": 0.5390625, "5m": 0.50146484375, "15m": 0.3076171875}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", 
"GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2985, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 546, "free": 2985}, "nocache": {"free": 3303, "used": 228}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2a85cf-21f1-7783-c9da-20681cb8e352", "ansible_product_uuid": "ec2a85cf-21f1-7783-c9da-20681cb8e352", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 826, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261798653952, "block_size": 4096, "block_total": 65519099, "block_available": 63915687, "block_used": 1603412, "inode_total": 131070960, "inode_available": 131029145, "inode_used": 41815, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_iscsi_iqn": "", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_is_chroot": false, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "34", "second": "42", "epoch": "1726853682", "epoch_int": "1726853682", "date": "2024-09-20", "time": "13:34:42", "iso8601_micro": "2024-09-20T17:34:42.562951Z", "iso8601": "2024-09-20T17:34:42Z", "iso8601_basic": "20240920T133442562951", "iso8601_basic_short": "20240920T133442", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_interfaces": ["eth0", "rpltstbr", "ethtest0", "lo", "peerethtest0"], 
"ansible_eth0": {"device": "eth0", "macaddress": "12:2a:53:36:f0:e9", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::102a:53ff:fe36:f0e9", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", 
"generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_rpltstbr": {"device": "rpltstbr", "macaddress": "6e:7c:f1:8e:1c:81", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "ipv4": {"address": "192.0.2.72", "broadcast": "", "netmask": "255.255.255.254", "network": "192.0.2.72", "prefix": "31"}, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", 
"rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_ethtest0": {"device": "ethtest0", "macaddress": "ae:20:01:6f:4b:76", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::ac20:1ff:fe6f:4b76", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_peerethtest0": {"device": "peerethtest0", "macaddress": "62:d7:bc:c2:71:2d", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::60d7:bcff:fec2:712d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", 
"scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:2a:53:36:f0:e9", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.217", "192.0.2.72"], "ansible_all_ipv6_addresses": ["fe80::102a:53ff:fe36:f0e9", "fe80::ac20:1ff:fe6f:4b76", "fe80::60d7:bcff:fec2:712d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.217", "127.0.0.0/8", "127.0.0.1", "192.0.2.72"], "ipv6": ["::1", "fe80::102a:53ff:fe36:f0e9", "fe80::60d7:bcff:fec2:712d", "fe80::ac20:1ff:fe6f:4b76"]}, "ansible_local": {}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 29922 1726853682.66057: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 29922 1726853682.66069: stdout chunk (state=3): >>><<< 29922 1726853682.66084: stderr chunk (state=3): >>><<< 29922 1726853682.66135: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_lsb": {}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 55604 10.31.11.217 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 55604 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_apparmor": {"status": "disabled"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-11-217.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-217", "ansible_nodename": "ip-10-31-11-217.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2a85cf21f17783c9da20681cb8e352", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCoYDjPBT+oZMoH+Iz89VMad3pzgkIKqOuUO8QZPyEpVfgGNjVOaglgkXLQyOXulclB6EA4nlBVmVP6IyWL+N4gskjf5Qmm5n+WHu3amXXm9l66v+yKWruV18sgIn8o6iAdCgZrJFzdFVNeKRLYV6P67syyegqRkOJe7U2m/rxA967Vm6wcrwJN8eiZc8tbx1lHOuJNCcP20ThNxMHIPfvT2im8rlt/ojf6e+Z1axRnvFBubBg1qDfxHEX6AlxMHl+CIOXxGHxsvSxfLq7lhrXrComBSxTDv+fmHeJkck3VGck2rn8Hz3eBTty453RU3pq6KxdqU1GB+Z+VYHInXEtXb2i38QpLqRfiLqwUxdjkFKqz0j5z/+NskQF3w8j6uz77Revs9G8eMs14sICJtnD9qBLhFuGxlR/ovZCjynVjBTKBfDaVASpjV0iCIZKSgLn6zSaM/6FBhnBoeb4ch2iiw2S8Q0aKJNKIp0AfY21IVplyS3VCu+Br3fIXzoINo0s=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIX3PaAHBZzPq23oQJkywX/2bB39yz9ZrSLgsNfhL04NHwnY0Up/oiN+aiUte1DWFqV5wiDLJpl9a1tDLARWXNA=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIBb2CS4KVp6KVVqnGA45j7tkSkijXfGxbd3neKpDsjh9", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_fips": false, "ansible_fibre_channel_wwn": [], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_loadavg": {"1m": 0.5390625, "5m": 0.50146484375, "15m": 0.3076171875}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2985, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 546, "free": 2985}, "nocache": {"free": 3303, "used": 228}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2a85cf-21f1-7783-c9da-20681cb8e352", "ansible_product_uuid": "ec2a85cf-21f1-7783-c9da-20681cb8e352", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": 
"250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 826, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261798653952, "block_size": 4096, "block_total": 65519099, "block_available": 63915687, "block_used": 1603412, "inode_total": 131070960, "inode_available": 131029145, "inode_used": 41815, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_iscsi_iqn": "", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_is_chroot": false, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "34", "second": "42", "epoch": "1726853682", "epoch_int": "1726853682", "date": "2024-09-20", "time": "13:34:42", "iso8601_micro": "2024-09-20T17:34:42.562951Z", "iso8601": "2024-09-20T17:34:42Z", "iso8601_basic": "20240920T133442562951", "iso8601_basic_short": "20240920T133442", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_interfaces": ["eth0", "rpltstbr", "ethtest0", "lo", "peerethtest0"], "ansible_eth0": {"device": "eth0", "macaddress": "12:2a:53:36:f0:e9", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::102a:53ff:fe36:f0e9", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off 
[fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_rpltstbr": {"device": "rpltstbr", "macaddress": "6e:7c:f1:8e:1c:81", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "ipv4": 
{"address": "192.0.2.72", "broadcast": "", "netmask": "255.255.255.254", "network": "192.0.2.72", "prefix": "31"}, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_ethtest0": {"device": "ethtest0", "macaddress": "ae:20:01:6f:4b:76", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::ac20:1ff:fe6f:4b76", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", 
"tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_peerethtest0": {"device": "peerethtest0", "macaddress": "62:d7:bc:c2:71:2d", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::60d7:bcff:fec2:712d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", 
"interface": "eth0", "address": "10.31.11.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:2a:53:36:f0:e9", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.217", "192.0.2.72"], "ansible_all_ipv6_addresses": ["fe80::102a:53ff:fe36:f0e9", "fe80::ac20:1ff:fe6f:4b76", "fe80::60d7:bcff:fec2:712d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.217", "127.0.0.0/8", "127.0.0.1", "192.0.2.72"], "ipv6": ["::1", "fe80::102a:53ff:fe36:f0e9", "fe80::60d7:bcff:fec2:712d", "fe80::ac20:1ff:fe6f:4b76"]}, "ansible_local": {}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
29922 1726853682.66576: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853681.7653592-31364-192648150862437/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 29922 1726853682.66596: _low_level_execute_command(): starting 29922 1726853682.66599: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853681.7653592-31364-192648150862437/ > /dev/null 2>&1 && sleep 0' 29922 1726853682.67189: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853682.67203: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853682.67214: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 29922 1726853682.67293: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853682.67313: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853682.67336: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853682.67349: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853682.67598: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853682.69559: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853682.69675: stdout chunk (state=3): >>><<< 29922 1726853682.69680: stderr chunk (state=3): >>><<< 29922 1726853682.69682: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853682.69685: handler run complete 29922 1726853682.69792: variable 'ansible_facts' from source: unknown 29922 1726853682.69935: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853682.70376: variable 'ansible_facts' from source: unknown 29922 1726853682.70490: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853682.70651: attempt loop complete, returning result 29922 1726853682.70668: _execute() done 29922 1726853682.70680: dumping result to json 29922 1726853682.70777: done dumping result, returning 29922 1726853682.70780: done running TaskExecutor() for managed_node3/TASK: Gathering Facts [02083763-bbaf-51d4-513b-00000000057e] 29922 1726853682.70782: sending task result for task 02083763-bbaf-51d4-513b-00000000057e 29922 1726853682.71643: done sending task result for task 02083763-bbaf-51d4-513b-00000000057e 29922 1726853682.71647: WORKER PROCESS EXITING ok: [managed_node3] 29922 1726853682.71979: no more pending results, returning what we have 29922 1726853682.71983: results queue empty 29922 1726853682.71983: checking for any_errors_fatal 29922 1726853682.71985: done checking for any_errors_fatal 29922 1726853682.71986: checking for max_fail_percentage 29922 1726853682.71987: done checking for max_fail_percentage 29922 1726853682.71988: checking to see if all hosts have failed and the running result is not ok 29922 1726853682.71989: done checking to see if all hosts have failed 29922 1726853682.71990: getting the remaining hosts for this loop 29922 1726853682.71991: done getting the remaining hosts for this loop 29922 1726853682.71994: getting the next task for host managed_node3 29922 1726853682.72000: done getting next task for host managed_node3 29922 1726853682.72002: ^ task is: TASK: meta (flush_handlers) 29922 1726853682.72004: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853682.72008: getting variables 29922 1726853682.72009: in VariableManager get_vars() 29922 1726853682.72032: Calling all_inventory to load vars for managed_node3 29922 1726853682.72034: Calling groups_inventory to load vars for managed_node3 29922 1726853682.72037: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853682.72047: Calling all_plugins_play to load vars for managed_node3 29922 1726853682.72050: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853682.72053: Calling groups_plugins_play to load vars for managed_node3 29922 1726853682.74638: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853682.77914: done with get_vars() 29922 1726853682.77949: done getting variables 29922 1726853682.78069: in VariableManager get_vars() 29922 1726853682.78083: Calling all_inventory to load vars for managed_node3 29922 1726853682.78086: Calling groups_inventory to load vars for managed_node3 29922 1726853682.78088: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853682.78094: Calling all_plugins_play to load vars for managed_node3 29922 1726853682.78098: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853682.78101: Calling groups_plugins_play to load vars for managed_node3 29922 1726853682.80077: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853682.81947: done with get_vars() 29922 1726853682.81988: done queuing things up, now waiting for results queue to drain 29922 1726853682.81990: results queue empty 29922 1726853682.81991: checking for any_errors_fatal 29922 1726853682.81995: done checking for any_errors_fatal 29922 1726853682.81996: checking for max_fail_percentage 29922 1726853682.82002: done checking for max_fail_percentage 29922 1726853682.82003: checking to see if all hosts have failed and the running result is not ok 29922 1726853682.82004: done checking to see if all hosts have failed 29922 1726853682.82004: getting the remaining hosts for this loop 29922 1726853682.82005: done getting the remaining hosts for this loop 29922 1726853682.82008: getting the next task for host managed_node3 29922 1726853682.82013: done getting next task for host managed_node3 29922 1726853682.82015: ^ task is: TASK: Include the task 'delete_interface.yml' 29922 1726853682.82017: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853682.82019: getting variables 29922 1726853682.82020: in VariableManager get_vars() 29922 1726853682.82030: Calling all_inventory to load vars for managed_node3 29922 1726853682.82032: Calling groups_inventory to load vars for managed_node3 29922 1726853682.82034: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853682.82040: Calling all_plugins_play to load vars for managed_node3 29922 1726853682.82042: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853682.82050: Calling groups_plugins_play to load vars for managed_node3 29922 1726853682.89809: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853682.91769: done with get_vars() 29922 1726853682.91865: done getting variables TASK [Include the task 'delete_interface.yml'] ********************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:8 Friday 20 September 2024 13:34:42 -0400 (0:00:01.244) 0:00:31.848 ****** 29922 1726853682.91943: entering _queue_task() for managed_node3/include_tasks 29922 1726853682.92388: worker is 1 (out of 1 available) 29922 1726853682.92399: exiting _queue_task() for managed_node3/include_tasks 29922 1726853682.92412: done queuing things up, now waiting for results queue to drain 29922 1726853682.92413: waiting for pending results... 29922 1726853682.92751: running TaskExecutor() for managed_node3/TASK: Include the task 'delete_interface.yml' 29922 1726853682.92809: in run() - task 02083763-bbaf-51d4-513b-000000000089 29922 1726853682.92827: variable 'ansible_search_path' from source: unknown 29922 1726853682.92876: calling self._execute() 29922 1726853682.92986: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853682.92998: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853682.93012: variable 'omit' from source: magic vars 29922 1726853682.93499: variable 'ansible_distribution_major_version' from source: facts 29922 1726853682.93504: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853682.93507: _execute() done 29922 1726853682.93510: dumping result to json 29922 1726853682.93512: done dumping result, returning 29922 1726853682.93514: done running TaskExecutor() for managed_node3/TASK: Include the task 'delete_interface.yml' [02083763-bbaf-51d4-513b-000000000089] 29922 1726853682.93516: sending task result for task 02083763-bbaf-51d4-513b-000000000089 29922 1726853682.93677: done sending task result for task 02083763-bbaf-51d4-513b-000000000089 29922 1726853682.93681: WORKER PROCESS EXITING 29922 1726853682.93707: no more pending results, returning what we have 29922 1726853682.93713: in VariableManager get_vars() 29922 1726853682.93758: Calling all_inventory to load vars for managed_node3 29922 1726853682.93761: Calling groups_inventory to load vars for managed_node3 29922 1726853682.93764: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853682.93778: Calling all_plugins_play to load vars for managed_node3 29922 1726853682.93780: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853682.93783: Calling groups_plugins_play to load vars for managed_node3 29922 1726853682.95347: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853682.97399: done with get_vars() 29922 
1726853682.97429: variable 'ansible_search_path' from source: unknown 29922 1726853682.97461: we have included files to process 29922 1726853682.97462: generating all_blocks data 29922 1726853682.97464: done generating all_blocks data 29922 1726853682.97465: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 29922 1726853682.97466: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 29922 1726853682.97469: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 29922 1726853682.97734: done processing included file 29922 1726853682.97740: iterating over new_blocks loaded from include file 29922 1726853682.97742: in VariableManager get_vars() 29922 1726853682.97754: done with get_vars() 29922 1726853682.97756: filtering new block on tags 29922 1726853682.97775: done filtering new block on tags 29922 1726853682.97777: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml for managed_node3 29922 1726853682.97791: extending task lists for all hosts with included blocks 29922 1726853682.97855: done extending task lists 29922 1726853682.97857: done processing included files 29922 1726853682.97857: results queue empty 29922 1726853682.97858: checking for any_errors_fatal 29922 1726853682.97860: done checking for any_errors_fatal 29922 1726853682.97861: checking for max_fail_percentage 29922 1726853682.97862: done checking for max_fail_percentage 29922 1726853682.97863: checking to see if all hosts have failed and the running result is not ok 29922 1726853682.97864: done checking to see if all hosts have failed 29922 1726853682.97864: getting the remaining hosts for this loop 29922 1726853682.97865: done getting the remaining hosts for this loop 29922 1726853682.97868: getting the next task for host managed_node3 29922 1726853682.97873: done getting next task for host managed_node3 29922 1726853682.97875: ^ task is: TASK: Remove test interface if necessary 29922 1726853682.97878: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853682.97880: getting variables 29922 1726853682.97881: in VariableManager get_vars() 29922 1726853682.97890: Calling all_inventory to load vars for managed_node3 29922 1726853682.97892: Calling groups_inventory to load vars for managed_node3 29922 1726853682.97895: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853682.97900: Calling all_plugins_play to load vars for managed_node3 29922 1726853682.97903: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853682.97906: Calling groups_plugins_play to load vars for managed_node3 29922 1726853682.99186: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853683.01092: done with get_vars() 29922 1726853683.01119: done getting variables 29922 1726853683.01169: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Remove test interface if necessary] ************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml:3 Friday 20 September 2024 13:34:43 -0400 (0:00:00.092) 0:00:31.941 ****** 29922 1726853683.01200: entering _queue_task() for managed_node3/command 29922 1726853683.01612: worker is 1 (out of 1 available) 29922 1726853683.01624: exiting _queue_task() for managed_node3/command 29922 1726853683.01636: done queuing things up, now waiting for results queue to drain 29922 1726853683.01638: waiting for pending results... 29922 1726853683.01841: running TaskExecutor() for managed_node3/TASK: Remove test interface if necessary 29922 1726853683.01970: in run() - task 02083763-bbaf-51d4-513b-00000000058f 29922 1726853683.02004: variable 'ansible_search_path' from source: unknown 29922 1726853683.02079: variable 'ansible_search_path' from source: unknown 29922 1726853683.02083: calling self._execute() 29922 1726853683.02179: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853683.02216: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853683.02235: variable 'omit' from source: magic vars 29922 1726853683.02707: variable 'ansible_distribution_major_version' from source: facts 29922 1726853683.02732: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853683.02735: variable 'omit' from source: magic vars 29922 1726853683.02772: variable 'omit' from source: magic vars 29922 1726853683.02862: variable 'interface' from source: set_fact 29922 1726853683.02879: variable 'omit' from source: magic vars 29922 1726853683.02923: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29922 1726853683.02948: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29922 1726853683.02966: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29922 1726853683.02988: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853683.02992: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 
1726853683.03016: variable 'inventory_hostname' from source: host vars for 'managed_node3' 29922 1726853683.03019: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853683.03022: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853683.03100: Set connection var ansible_connection to ssh 29922 1726853683.03116: Set connection var ansible_module_compression to ZIP_DEFLATED 29922 1726853683.03119: Set connection var ansible_shell_executable to /bin/sh 29922 1726853683.03153: Set connection var ansible_pipelining to False 29922 1726853683.03158: Set connection var ansible_timeout to 10 29922 1726853683.03161: Set connection var ansible_shell_type to sh 29922 1726853683.03187: variable 'ansible_shell_executable' from source: unknown 29922 1726853683.03193: variable 'ansible_connection' from source: unknown 29922 1726853683.03195: variable 'ansible_module_compression' from source: unknown 29922 1726853683.03240: variable 'ansible_shell_type' from source: unknown 29922 1726853683.03243: variable 'ansible_shell_executable' from source: unknown 29922 1726853683.03245: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853683.03248: variable 'ansible_pipelining' from source: unknown 29922 1726853683.03258: variable 'ansible_timeout' from source: unknown 29922 1726853683.03264: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853683.03393: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 29922 1726853683.03397: variable 'omit' from source: magic vars 29922 1726853683.03461: starting attempt loop 29922 1726853683.03465: running the handler 29922 1726853683.03467: _low_level_execute_command(): starting 29922 1726853683.03475: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 29922 1726853683.04332: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853683.04337: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853683.04402: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853683.06102: stdout chunk (state=3): >>>/root <<< 29922 1726853683.06206: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 
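(Annotation, not part of the trace: the /bin/sh -c 'echo ~ && sleep 0' probe being executed above — whose stdout, /root, appears in the next entries — is how the ssh connection plugin locates the remote user's home directory before creating a per-task temp directory under ~/.ansible/tmp. A rough manual equivalent is sketched below; the ControlMaster socket path and target address are copied from the OpenSSH debug lines, while the root@ user and the exact extra options Ansible would add are assumptions.)

    # approximate manual equivalent of the home-directory probe above,
    # reusing the existing ControlMaster socket shown in the debug output
    ssh -o ControlPath=/root/.ansible/cp/bee039678b root@10.31.11.217 "/bin/sh -c 'echo ~ && sleep 0'"
    # expected stdout: /root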
29922 1726853683.06242: stderr chunk (state=3): >>><<< 29922 1726853683.06246: stdout chunk (state=3): >>><<< 29922 1726853683.06280: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853683.06302: _low_level_execute_command(): starting 29922 1726853683.06316: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853683.06287-31419-28752920477469 `" && echo ansible-tmp-1726853683.06287-31419-28752920477469="` echo /root/.ansible/tmp/ansible-tmp-1726853683.06287-31419-28752920477469 `" ) && sleep 0' 29922 1726853683.06796: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853683.06800: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853683.06802: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 29922 1726853683.06805: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853683.06807: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853683.06821: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 29922 1726853683.06825: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853683.06845: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853683.06848: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853683.06924: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853683.08892: stdout chunk (state=3): 
>>>ansible-tmp-1726853683.06287-31419-28752920477469=/root/.ansible/tmp/ansible-tmp-1726853683.06287-31419-28752920477469 <<< 29922 1726853683.09021: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853683.09024: stdout chunk (state=3): >>><<< 29922 1726853683.09075: stderr chunk (state=3): >>><<< 29922 1726853683.09292: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853683.06287-31419-28752920477469=/root/.ansible/tmp/ansible-tmp-1726853683.06287-31419-28752920477469 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853683.09327: variable 'ansible_module_compression' from source: unknown 29922 1726853683.09399: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-29922avbm1zsd/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 29922 1726853683.09433: variable 'ansible_facts' from source: unknown 29922 1726853683.09582: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853683.06287-31419-28752920477469/AnsiballZ_command.py 29922 1726853683.09937: Sending initial data 29922 1726853683.09942: Sent initial data (153 bytes) 29922 1726853683.10931: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853683.10982: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853683.11010: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853683.11074: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 
1726853683.11166: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853683.12849: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 29922 1726853683.12930: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 29922 1726853683.12977: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-29922avbm1zsd/tmpjuynoda8 /root/.ansible/tmp/ansible-tmp-1726853683.06287-31419-28752920477469/AnsiballZ_command.py <<< 29922 1726853683.12980: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853683.06287-31419-28752920477469/AnsiballZ_command.py" <<< 29922 1726853683.13045: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-29922avbm1zsd/tmpjuynoda8" to remote "/root/.ansible/tmp/ansible-tmp-1726853683.06287-31419-28752920477469/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853683.06287-31419-28752920477469/AnsiballZ_command.py" <<< 29922 1726853683.14675: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853683.14828: stdout chunk (state=3): >>><<< 29922 1726853683.14831: stderr chunk (state=3): >>><<< 29922 1726853683.14833: done transferring module to remote 29922 1726853683.14849: _low_level_execute_command(): starting 29922 1726853683.14859: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853683.06287-31419-28752920477469/ /root/.ansible/tmp/ansible-tmp-1726853683.06287-31419-28752920477469/AnsiballZ_command.py && sleep 0' 29922 1726853683.16497: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853683.16591: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853683.16596: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853683.16709: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853683.16833: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853683.19047: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853683.19051: stdout chunk (state=3): >>><<< 29922 1726853683.19054: stderr chunk (state=3): >>><<< 29922 1726853683.19057: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853683.19059: _low_level_execute_command(): starting 29922 1726853683.19062: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853683.06287-31419-28752920477469/AnsiballZ_command.py && sleep 0' 29922 1726853683.19708: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853683.19716: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853683.19719: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853683.19809: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853683.19812: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853683.20005: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853683.36706: stdout chunk (state=3): >>> {"changed": true, "stdout": "", 
"stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "ethtest0"], "start": "2024-09-20 13:34:43.353874", "end": "2024-09-20 13:34:43.363990", "delta": "0:00:00.010116", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del ethtest0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 29922 1726853683.39157: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 29922 1726853683.39162: stdout chunk (state=3): >>><<< 29922 1726853683.39164: stderr chunk (state=3): >>><<< 29922 1726853683.39414: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "ethtest0"], "start": "2024-09-20 13:34:43.353874", "end": "2024-09-20 13:34:43.363990", "delta": "0:00:00.010116", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del ethtest0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
29922 1726853683.39419: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del ethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853683.06287-31419-28752920477469/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 29922 1726853683.39421: _low_level_execute_command(): starting 29922 1726853683.39423: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853683.06287-31419-28752920477469/ > /dev/null 2>&1 && sleep 0' 29922 1726853683.40655: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853683.40969: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853683.41288: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853683.42912: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853683.42940: stderr chunk (state=3): >>><<< 29922 1726853683.42949: stdout chunk (state=3): >>><<< 29922 1726853683.42999: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853683.43027: handler run complete 29922 1726853683.43055: Evaluated conditional (False): False 29922 1726853683.43141: attempt loop complete, returning result 29922 1726853683.43149: _execute() done 29922 1726853683.43156: dumping result to json 29922 1726853683.43165: done dumping result, returning 29922 1726853683.43181: done running TaskExecutor() for managed_node3/TASK: Remove test interface if necessary [02083763-bbaf-51d4-513b-00000000058f] 29922 1726853683.43190: sending task result for task 02083763-bbaf-51d4-513b-00000000058f ok: [managed_node3] => { "changed": false, "cmd": [ "ip", "link", "del", "ethtest0" ], "delta": "0:00:00.010116", "end": "2024-09-20 13:34:43.363990", "rc": 0, "start": "2024-09-20 13:34:43.353874" } 29922 1726853683.43409: no more pending results, returning what we have 29922 1726853683.43413: results queue empty 29922 1726853683.43414: checking for any_errors_fatal 29922 1726853683.43416: done checking for any_errors_fatal 29922 1726853683.43416: checking for max_fail_percentage 29922 1726853683.43418: done checking for max_fail_percentage 29922 1726853683.43419: checking to see if all hosts have failed and the running result is not ok 29922 1726853683.43420: done checking to see if all hosts have failed 29922 1726853683.43421: getting the remaining hosts for this loop 29922 1726853683.43422: done getting the remaining hosts for this loop 29922 1726853683.43426: getting the next task for host managed_node3 29922 1726853683.43436: done getting next task for host managed_node3 29922 1726853683.43438: ^ task is: TASK: meta (flush_handlers) 29922 1726853683.43441: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853683.43447: getting variables 29922 1726853683.43449: in VariableManager get_vars() 29922 1726853683.43483: Calling all_inventory to load vars for managed_node3 29922 1726853683.43486: Calling groups_inventory to load vars for managed_node3 29922 1726853683.43490: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853683.43503: Calling all_plugins_play to load vars for managed_node3 29922 1726853683.43506: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853683.43509: Calling groups_plugins_play to load vars for managed_node3 29922 1726853683.44378: done sending task result for task 02083763-bbaf-51d4-513b-00000000058f 29922 1726853683.44382: WORKER PROCESS EXITING 29922 1726853683.46818: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853683.49380: done with get_vars() 29922 1726853683.49410: done getting variables 29922 1726853683.49492: in VariableManager get_vars() 29922 1726853683.49503: Calling all_inventory to load vars for managed_node3 29922 1726853683.49505: Calling groups_inventory to load vars for managed_node3 29922 1726853683.49508: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853683.49513: Calling all_plugins_play to load vars for managed_node3 29922 1726853683.49515: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853683.49518: Calling groups_plugins_play to load vars for managed_node3 29922 1726853683.50828: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853683.53320: done with get_vars() 29922 1726853683.53356: done queuing things up, now waiting for results queue to drain 29922 1726853683.53358: results queue empty 29922 1726853683.53359: checking for any_errors_fatal 29922 1726853683.53362: done checking for any_errors_fatal 29922 1726853683.53363: checking for max_fail_percentage 29922 1726853683.53364: done checking for max_fail_percentage 29922 1726853683.53365: checking to see if all hosts have failed and the running result is not ok 29922 1726853683.53365: done checking to see if all hosts have failed 29922 1726853683.53366: getting the remaining hosts for this loop 29922 1726853683.53367: done getting the remaining hosts for this loop 29922 1726853683.53370: getting the next task for host managed_node3 29922 1726853683.53478: done getting next task for host managed_node3 29922 1726853683.53481: ^ task is: TASK: meta (flush_handlers) 29922 1726853683.53482: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853683.53485: getting variables 29922 1726853683.53487: in VariableManager get_vars() 29922 1726853683.53497: Calling all_inventory to load vars for managed_node3 29922 1726853683.53499: Calling groups_inventory to load vars for managed_node3 29922 1726853683.53501: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853683.53507: Calling all_plugins_play to load vars for managed_node3 29922 1726853683.53509: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853683.53512: Calling groups_plugins_play to load vars for managed_node3 29922 1726853683.55013: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853683.56623: done with get_vars() 29922 1726853683.56658: done getting variables 29922 1726853683.56728: in VariableManager get_vars() 29922 1726853683.56737: Calling all_inventory to load vars for managed_node3 29922 1726853683.56740: Calling groups_inventory to load vars for managed_node3 29922 1726853683.56742: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853683.56747: Calling all_plugins_play to load vars for managed_node3 29922 1726853683.56755: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853683.56760: Calling groups_plugins_play to load vars for managed_node3 29922 1726853683.59567: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853683.61533: done with get_vars() 29922 1726853683.61594: done queuing things up, now waiting for results queue to drain 29922 1726853683.61602: results queue empty 29922 1726853683.61617: checking for any_errors_fatal 29922 1726853683.61619: done checking for any_errors_fatal 29922 1726853683.61620: checking for max_fail_percentage 29922 1726853683.61621: done checking for max_fail_percentage 29922 1726853683.61622: checking to see if all hosts have failed and the running result is not ok 29922 1726853683.61623: done checking to see if all hosts have failed 29922 1726853683.61623: getting the remaining hosts for this loop 29922 1726853683.61624: done getting the remaining hosts for this loop 29922 1726853683.61628: getting the next task for host managed_node3 29922 1726853683.61632: done getting next task for host managed_node3 29922 1726853683.61633: ^ task is: None 29922 1726853683.61635: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853683.61636: done queuing things up, now waiting for results queue to drain 29922 1726853683.61637: results queue empty 29922 1726853683.61638: checking for any_errors_fatal 29922 1726853683.61638: done checking for any_errors_fatal 29922 1726853683.61639: checking for max_fail_percentage 29922 1726853683.61640: done checking for max_fail_percentage 29922 1726853683.61641: checking to see if all hosts have failed and the running result is not ok 29922 1726853683.61642: done checking to see if all hosts have failed 29922 1726853683.61643: getting the next task for host managed_node3 29922 1726853683.61645: done getting next task for host managed_node3 29922 1726853683.61646: ^ task is: None 29922 1726853683.61647: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29922 1726853683.61749: in VariableManager get_vars() 29922 1726853683.61781: done with get_vars() 29922 1726853683.61789: in VariableManager get_vars() 29922 1726853683.61803: done with get_vars() 29922 1726853683.61812: variable 'omit' from source: magic vars 29922 1726853683.61968: variable 'profile' from source: play vars 29922 1726853683.62535: in VariableManager get_vars() 29922 1726853683.62558: done with get_vars() 29922 1726853683.62706: variable 'omit' from source: magic vars 29922 1726853683.62925: variable 'profile' from source: play vars PLAY [Remove {{ profile }}] **************************************************** 29922 1726853683.64652: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 29922 1726853683.64721: getting the remaining hosts for this loop 29922 1726853683.64722: done getting the remaining hosts for this loop 29922 1726853683.64725: getting the next task for host managed_node3 29922 1726853683.64727: done getting next task for host managed_node3 29922 1726853683.64729: ^ task is: TASK: Gathering Facts 29922 1726853683.64731: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853683.64733: getting variables 29922 1726853683.64734: in VariableManager get_vars() 29922 1726853683.64747: Calling all_inventory to load vars for managed_node3 29922 1726853683.64750: Calling groups_inventory to load vars for managed_node3 29922 1726853683.64752: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853683.64760: Calling all_plugins_play to load vars for managed_node3 29922 1726853683.64762: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853683.64813: Calling groups_plugins_play to load vars for managed_node3 29922 1726853683.66678: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853683.68158: done with get_vars() 29922 1726853683.68188: done getting variables 29922 1726853683.68239: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml:3 Friday 20 September 2024 13:34:43 -0400 (0:00:00.670) 0:00:32.612 ****** 29922 1726853683.68279: entering _queue_task() for managed_node3/gather_facts 29922 1726853683.68603: worker is 1 (out of 1 available) 29922 1726853683.68615: exiting _queue_task() for managed_node3/gather_facts 29922 1726853683.68626: done queuing things up, now waiting for results queue to drain 29922 1726853683.68628: waiting for pending results... 
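(Annotation, not part of the trace: from here the run moves into the next play, "Remove {{ profile }}", and re-gathers facts. The setup module goes through the same transfer-and-execute cycle shown above; if you only wanted to repeat this fact-gathering step by hand, an ad-hoc call along the following lines should produce an equivalent trace. The host pattern is an assumption based on this run's target, and the -i inventory option is omitted here.)

    # hypothetical ad-hoc equivalent of the "Gathering Facts" task queued above;
    # supply the matching inventory with -i so that managed_node3 resolves
    ansible managed_node3 -m ansible.builtin.setup -vv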
29922 1726853683.68812: running TaskExecutor() for managed_node3/TASK: Gathering Facts 29922 1726853683.68880: in run() - task 02083763-bbaf-51d4-513b-00000000059d 29922 1726853683.68891: variable 'ansible_search_path' from source: unknown 29922 1726853683.68923: calling self._execute() 29922 1726853683.69017: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853683.69022: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853683.69029: variable 'omit' from source: magic vars 29922 1726853683.69689: variable 'ansible_distribution_major_version' from source: facts 29922 1726853683.69694: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853683.69697: variable 'omit' from source: magic vars 29922 1726853683.69699: variable 'omit' from source: magic vars 29922 1726853683.69806: variable 'omit' from source: magic vars 29922 1726853683.69997: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29922 1726853683.70088: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29922 1726853683.70169: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29922 1726853683.70309: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853683.70343: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853683.70465: variable 'inventory_hostname' from source: host vars for 'managed_node3' 29922 1726853683.70506: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853683.70526: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853683.70630: Set connection var ansible_connection to ssh 29922 1726853683.70679: Set connection var ansible_module_compression to ZIP_DEFLATED 29922 1726853683.70686: Set connection var ansible_shell_executable to /bin/sh 29922 1726853683.70692: Set connection var ansible_pipelining to False 29922 1726853683.70695: Set connection var ansible_timeout to 10 29922 1726853683.70697: Set connection var ansible_shell_type to sh 29922 1726853683.70719: variable 'ansible_shell_executable' from source: unknown 29922 1726853683.70726: variable 'ansible_connection' from source: unknown 29922 1726853683.70789: variable 'ansible_module_compression' from source: unknown 29922 1726853683.70794: variable 'ansible_shell_type' from source: unknown 29922 1726853683.70802: variable 'ansible_shell_executable' from source: unknown 29922 1726853683.70805: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853683.70807: variable 'ansible_pipelining' from source: unknown 29922 1726853683.70810: variable 'ansible_timeout' from source: unknown 29922 1726853683.70812: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853683.71023: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 29922 1726853683.71112: variable 'omit' from source: magic vars 29922 1726853683.71116: starting attempt loop 29922 1726853683.71119: running the 
handler 29922 1726853683.71121: variable 'ansible_facts' from source: unknown 29922 1726853683.71124: _low_level_execute_command(): starting 29922 1726853683.71126: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 29922 1726853683.72102: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853683.72162: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853683.72200: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853683.72230: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853683.72252: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853683.72307: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853683.74030: stdout chunk (state=3): >>>/root <<< 29922 1726853683.74178: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853683.74215: stderr chunk (state=3): >>><<< 29922 1726853683.74249: stdout chunk (state=3): >>><<< 29922 1726853683.74286: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853683.74379: _low_level_execute_command(): starting 29922 1726853683.74383: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853683.7429366-31462-162195012486395 `" && echo 
ansible-tmp-1726853683.7429366-31462-162195012486395="` echo /root/.ansible/tmp/ansible-tmp-1726853683.7429366-31462-162195012486395 `" ) && sleep 0' 29922 1726853683.75102: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853683.75117: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853683.75140: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853683.75166: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853683.75190: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 29922 1726853683.75204: stderr chunk (state=3): >>>debug2: match not found <<< 29922 1726853683.75247: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853683.75264: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853683.75304: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853683.75389: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853683.75408: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853683.75429: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853683.75606: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853683.77591: stdout chunk (state=3): >>>ansible-tmp-1726853683.7429366-31462-162195012486395=/root/.ansible/tmp/ansible-tmp-1726853683.7429366-31462-162195012486395 <<< 29922 1726853683.77792: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853683.77841: stdout chunk (state=3): >>><<< 29922 1726853683.77844: stderr chunk (state=3): >>><<< 29922 1726853683.77915: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853683.7429366-31462-162195012486395=/root/.ansible/tmp/ansible-tmp-1726853683.7429366-31462-162195012486395 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853683.78009: variable 'ansible_module_compression' from source: unknown 29922 1726853683.78109: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-29922avbm1zsd/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 29922 1726853683.78395: variable 'ansible_facts' from source: unknown 29922 1726853683.78451: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853683.7429366-31462-162195012486395/AnsiballZ_setup.py 29922 1726853683.78693: Sending initial data 29922 1726853683.78703: Sent initial data (154 bytes) 29922 1726853683.79779: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 29922 1726853683.79837: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853683.79921: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853683.80045: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853683.80114: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853683.80248: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853683.81944: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 29922 1726853683.81995: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 29922 1726853683.82099: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-29922avbm1zsd/tmpkg7vnkz6 /root/.ansible/tmp/ansible-tmp-1726853683.7429366-31462-162195012486395/AnsiballZ_setup.py <<< 29922 1726853683.82102: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853683.7429366-31462-162195012486395/AnsiballZ_setup.py" <<< 29922 1726853683.82152: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-29922avbm1zsd/tmpkg7vnkz6" to remote "/root/.ansible/tmp/ansible-tmp-1726853683.7429366-31462-162195012486395/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853683.7429366-31462-162195012486395/AnsiballZ_setup.py" <<< 29922 1726853683.83873: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853683.83996: stdout chunk (state=3): >>><<< 29922 1726853683.83999: stderr chunk (state=3): >>><<< 29922 1726853683.84002: done transferring module to remote 29922 1726853683.84004: _low_level_execute_command(): starting 29922 1726853683.84007: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853683.7429366-31462-162195012486395/ /root/.ansible/tmp/ansible-tmp-1726853683.7429366-31462-162195012486395/AnsiballZ_setup.py && sleep 0' 29922 1726853683.84770: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853683.84777: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853683.84982: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853683.85000: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853683.85044: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853683.85174: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853683.87105: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853683.87110: stdout chunk (state=3): >>><<< 29922 1726853683.87112: stderr chunk (state=3): >>><<< 29922 1726853683.87130: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853683.87139: _low_level_execute_command(): starting 29922 1726853683.87151: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853683.7429366-31462-162195012486395/AnsiballZ_setup.py && sleep 0' 29922 1726853683.88215: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853683.88229: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853683.88335: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853683.88348: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853683.88389: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853683.88495: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853683.88573: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853683.88639: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853684.52394: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCoYDjPBT+oZMoH+Iz89VMad3pzgkIKqOuUO8QZPyEpVfgGNjVOaglgkXLQyOXulclB6EA4nlBVmVP6IyWL+N4gskjf5Qmm5n+WHu3amXXm9l66v+yKWruV18sgIn8o6iAdCgZrJFzdFVNeKRLYV6P67syyegqRkOJe7U2m/rxA967Vm6wcrwJN8eiZc8tbx1lHOuJNCcP20ThNxMHIPfvT2im8rlt/ojf6e+Z1axRnvFBubBg1qDfxHEX6AlxMHl+CIOXxGHxsvSxfLq7lhrXrComBSxTDv+fmHeJkck3VGck2rn8Hz3eBTty453RU3pq6KxdqU1GB+Z+VYHInXEtXb2i38QpLqRfiLqwUxdjkFKqz0j5z/+NskQF3w8j6uz77Revs9G8eMs14sICJtnD9qBLhFuGxlR/ovZCjynVjBTKBfDaVASpjV0iCIZKSgLn6zSaM/6FBhnBoeb4ch2iiw2S8Q0aKJNKIp0AfY21IVplyS3VCu+Br3fIXzoINo0s=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIX3PaAHBZzPq23oQJkywX/2bB39yz9ZrSLgsNfhL04NHwnY0Up/oiN+aiUte1DWFqV5wiDLJpl9a1tDLARWXNA=", 
"ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIBb2CS4KVp6KVVqnGA45j7tkSkijXfGxbd3neKpDsjh9", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_loadavg": {"1m": 0.5390625, "5m": 0.50146484375, "15m": 0.3076171875}, "ansible_apparmor": {"status": "disabled"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-11-217.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-217", "ansible_nodename": "ip-10-31-11-217.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2a85cf21f17783c9da20681cb8e352", "ansible_local": {}, "ansible_is_chroot": false, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_iscsi_iqn": "", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "34", "second": "44", "epoch": "1726853684", "epoch_int": "1726853684", "date": "2024-09-20", "time": "13:34:44", "iso8601_micro": "2024-09-20T17:34:44.169228Z", "iso8601": "2024-09-20T17:34:44Z", "iso8601_basic": "20240920T133444169228", "iso8601_basic_short": "20240920T133444", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": 
"cpython"}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 55604 10.31.11.217 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 55604 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_lsb": {}, "ansible_interfaces": ["rpltstbr", "lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "12:2a:53:36:f0:e9", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::102a:53ff:fe36:f0e9", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, 
"timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_rpltstbr": {"device": "rpltstbr", "macaddress": "6e:7c:f1:8e:1c:81", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "ipv4": {"address": "192.0.2.72", "broadcast": "", "netmask": "255.255.255.254", "network": "192.0.2.72", "prefix": "31"}, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": 
"off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "o<<< 29922 1726853684.52461: stdout chunk (state=3): >>>ff [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:2a:53:36:f0:e9", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.217", "192.0.2.72"], "ansible_all_ipv6_addresses": ["fe80::102a:53ff:fe36:f0e9"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.217", "127.0.0.0/8", "127.0.0.1", "192.0.2.72"], "ipv6": ["::1", "fe80::102a:53ff:fe36:f0e9"]}, "ansible_fibre_channel_wwn": [], "ansible_fips": false, "ansible_service_mgr": "systemd", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2983, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 548, "free": 2983}, "nocache": {"free": 3301, "used": 230}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2a85cf-21f1-7783-c9da-20681cb8e352", "ansible_product_uuid": "ec2a85cf-21f1-7783-c9da-20681cb8e352", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": 
{"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 828, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261798682624, "block_size": 4096, "block_total": 65519099, "block_available": 63915694, "block_used": 1603405, "inode_total": 131070960, "inode_available": 131029145, "inode_used": 41815, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 29922 1726853684.54453: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 29922 1726853684.54635: stderr chunk (state=3): >>><<< 29922 1726853684.54639: stdout chunk (state=3): >>><<< 29922 1726853684.54974: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCoYDjPBT+oZMoH+Iz89VMad3pzgkIKqOuUO8QZPyEpVfgGNjVOaglgkXLQyOXulclB6EA4nlBVmVP6IyWL+N4gskjf5Qmm5n+WHu3amXXm9l66v+yKWruV18sgIn8o6iAdCgZrJFzdFVNeKRLYV6P67syyegqRkOJe7U2m/rxA967Vm6wcrwJN8eiZc8tbx1lHOuJNCcP20ThNxMHIPfvT2im8rlt/ojf6e+Z1axRnvFBubBg1qDfxHEX6AlxMHl+CIOXxGHxsvSxfLq7lhrXrComBSxTDv+fmHeJkck3VGck2rn8Hz3eBTty453RU3pq6KxdqU1GB+Z+VYHInXEtXb2i38QpLqRfiLqwUxdjkFKqz0j5z/+NskQF3w8j6uz77Revs9G8eMs14sICJtnD9qBLhFuGxlR/ovZCjynVjBTKBfDaVASpjV0iCIZKSgLn6zSaM/6FBhnBoeb4ch2iiw2S8Q0aKJNKIp0AfY21IVplyS3VCu+Br3fIXzoINo0s=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIX3PaAHBZzPq23oQJkywX/2bB39yz9ZrSLgsNfhL04NHwnY0Up/oiN+aiUte1DWFqV5wiDLJpl9a1tDLARWXNA=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIBb2CS4KVp6KVVqnGA45j7tkSkijXfGxbd3neKpDsjh9", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_loadavg": {"1m": 0.5390625, "5m": 0.50146484375, "15m": 0.3076171875}, "ansible_apparmor": {"status": "disabled"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-11-217.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-217", "ansible_nodename": "ip-10-31-11-217.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2a85cf21f17783c9da20681cb8e352", "ansible_local": {}, "ansible_is_chroot": false, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_iscsi_iqn": "", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", 
"ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "34", "second": "44", "epoch": "1726853684", "epoch_int": "1726853684", "date": "2024-09-20", "time": "13:34:44", "iso8601_micro": "2024-09-20T17:34:44.169228Z", "iso8601": "2024-09-20T17:34:44Z", "iso8601_basic": "20240920T133444169228", "iso8601_basic_short": "20240920T133444", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 55604 10.31.11.217 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 55604 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_lsb": {}, "ansible_interfaces": ["rpltstbr", "lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "12:2a:53:36:f0:e9", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::102a:53ff:fe36:f0e9", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", 
"tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off 
[fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_rpltstbr": {"device": "rpltstbr", "macaddress": "6e:7c:f1:8e:1c:81", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "ipv4": {"address": "192.0.2.72", "broadcast": "", "netmask": "255.255.255.254", "network": "192.0.2.72", "prefix": "31"}, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:2a:53:36:f0:e9", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.217", "192.0.2.72"], "ansible_all_ipv6_addresses": ["fe80::102a:53ff:fe36:f0e9"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.217", "127.0.0.0/8", "127.0.0.1", "192.0.2.72"], "ipv6": ["::1", "fe80::102a:53ff:fe36:f0e9"]}, "ansible_fibre_channel_wwn": [], "ansible_fips": false, "ansible_service_mgr": "systemd", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 
2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2983, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 548, "free": 2983}, "nocache": {"free": 3301, "used": 230}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2a85cf-21f1-7783-c9da-20681cb8e352", "ansible_product_uuid": "ec2a85cf-21f1-7783-c9da-20681cb8e352", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 828, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261798682624, "block_size": 4096, "block_total": 65519099, "block_available": 63915694, "block_used": 1603405, "inode_total": 131070960, "inode_available": 131029145, "inode_used": 41815, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 29922 1726853684.55707: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853683.7429366-31462-162195012486395/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 29922 1726853684.55733: _low_level_execute_command(): starting 29922 1726853684.55944: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853683.7429366-31462-162195012486395/ > /dev/null 2>&1 && sleep 0' 29922 1726853684.56894: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853684.56908: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration <<< 29922 1726853684.56923: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853684.56976: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853684.56987: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853684.57289: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853684.57400: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853684.59283: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853684.59316: stderr chunk (state=3): >>><<< 29922 1726853684.59325: stdout chunk (state=3): >>><<< 29922 1726853684.59349: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853684.59391: handler run complete 29922 1726853684.59679: variable 'ansible_facts' from source: unknown 29922 1726853684.59899: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853684.60790: variable 'ansible_facts' from source: unknown 29922 1726853684.60921: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853684.61345: attempt loop complete, returning result 29922 1726853684.61355: _execute() done 29922 1726853684.61364: dumping result to json 29922 1726853684.61405: done dumping result, returning 29922 1726853684.61425: done running TaskExecutor() for managed_node3/TASK: Gathering Facts [02083763-bbaf-51d4-513b-00000000059d] 29922 1726853684.61462: sending task result for task 02083763-bbaf-51d4-513b-00000000059d ok: [managed_node3] 29922 1726853684.63156: no more pending results, returning what we have 29922 1726853684.63160: results queue empty 29922 1726853684.63161: checking for any_errors_fatal 29922 1726853684.63162: done checking for any_errors_fatal 29922 1726853684.63163: checking for max_fail_percentage 29922 1726853684.63164: done checking for max_fail_percentage 29922 1726853684.63165: checking to see if all hosts have failed and the running result is not ok 29922 1726853684.63166: done checking to see if all hosts have failed 29922 1726853684.63167: getting the remaining hosts for this loop 29922 1726853684.63168: done getting the remaining hosts for this loop 29922 1726853684.63173: getting the next task for host managed_node3 29922 1726853684.63178: done getting next task for host managed_node3 29922 1726853684.63180: ^ task is: TASK: meta (flush_handlers) 29922 1726853684.63182: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853684.63186: getting variables 29922 1726853684.63187: in VariableManager get_vars() 29922 1726853684.63215: Calling all_inventory to load vars for managed_node3 29922 1726853684.63217: Calling groups_inventory to load vars for managed_node3 29922 1726853684.63219: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853684.63290: done sending task result for task 02083763-bbaf-51d4-513b-00000000059d 29922 1726853684.63294: WORKER PROCESS EXITING 29922 1726853684.63304: Calling all_plugins_play to load vars for managed_node3 29922 1726853684.63307: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853684.63311: Calling groups_plugins_play to load vars for managed_node3 29922 1726853684.65422: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853684.69095: done with get_vars() 29922 1726853684.69129: done getting variables 29922 1726853684.69326: in VariableManager get_vars() 29922 1726853684.69340: Calling all_inventory to load vars for managed_node3 29922 1726853684.69343: Calling groups_inventory to load vars for managed_node3 29922 1726853684.69345: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853684.69350: Calling all_plugins_play to load vars for managed_node3 29922 1726853684.69459: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853684.69469: Calling groups_plugins_play to load vars for managed_node3 29922 1726853684.71229: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853684.73083: done with get_vars() 29922 1726853684.73111: done queuing things up, now waiting for results queue to drain 29922 1726853684.73114: results queue empty 29922 1726853684.73114: checking for any_errors_fatal 29922 1726853684.73118: done checking for any_errors_fatal 29922 1726853684.73119: checking for max_fail_percentage 29922 1726853684.73120: done checking for max_fail_percentage 29922 1726853684.73125: checking to see if all hosts have failed and the running result is not ok 29922 1726853684.73126: done checking to see if all hosts have failed 29922 1726853684.73127: getting the remaining hosts for this loop 29922 1726853684.73127: done getting the remaining hosts for this loop 29922 1726853684.73130: getting the next task for host managed_node3 29922 1726853684.73134: done getting next task for host managed_node3 29922 1726853684.73137: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 29922 1726853684.73138: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853684.73147: getting variables 29922 1726853684.73148: in VariableManager get_vars() 29922 1726853684.73161: Calling all_inventory to load vars for managed_node3 29922 1726853684.73168: Calling groups_inventory to load vars for managed_node3 29922 1726853684.73172: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853684.73177: Calling all_plugins_play to load vars for managed_node3 29922 1726853684.73179: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853684.73181: Calling groups_plugins_play to load vars for managed_node3 29922 1726853684.74598: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853684.76342: done with get_vars() 29922 1726853684.76372: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 13:34:44 -0400 (0:00:01.081) 0:00:33.694 ****** 29922 1726853684.76450: entering _queue_task() for managed_node3/include_tasks 29922 1726853684.76907: worker is 1 (out of 1 available) 29922 1726853684.76919: exiting _queue_task() for managed_node3/include_tasks 29922 1726853684.76928: done queuing things up, now waiting for results queue to drain 29922 1726853684.76929: waiting for pending results... 29922 1726853684.77164: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 29922 1726853684.77292: in run() - task 02083763-bbaf-51d4-513b-000000000091 29922 1726853684.77314: variable 'ansible_search_path' from source: unknown 29922 1726853684.77323: variable 'ansible_search_path' from source: unknown 29922 1726853684.77370: calling self._execute() 29922 1726853684.77487: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853684.77505: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853684.77521: variable 'omit' from source: magic vars 29922 1726853684.77967: variable 'ansible_distribution_major_version' from source: facts 29922 1726853684.77999: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853684.78040: _execute() done 29922 1726853684.78043: dumping result to json 29922 1726853684.78046: done dumping result, returning 29922 1726853684.78048: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [02083763-bbaf-51d4-513b-000000000091] 29922 1726853684.78051: sending task result for task 02083763-bbaf-51d4-513b-000000000091 29922 1726853684.78238: done sending task result for task 02083763-bbaf-51d4-513b-000000000091 29922 1726853684.78241: WORKER PROCESS EXITING 29922 1726853684.78316: no more pending results, returning what we have 29922 1726853684.78322: in VariableManager get_vars() 29922 1726853684.78370: Calling all_inventory to load vars for managed_node3 29922 1726853684.78578: Calling groups_inventory to load vars for managed_node3 29922 1726853684.78582: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853684.78591: Calling all_plugins_play to load vars for managed_node3 29922 1726853684.78593: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853684.78596: Calling groups_plugins_play to load vars for managed_node3 29922 1726853684.80327: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853684.81256: done with get_vars() 29922 1726853684.81276: variable 'ansible_search_path' from source: unknown 29922 1726853684.81277: variable 'ansible_search_path' from source: unknown 29922 1726853684.81300: we have included files to process 29922 1726853684.81301: generating all_blocks data 29922 1726853684.81303: done generating all_blocks data 29922 1726853684.81303: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 29922 1726853684.81304: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 29922 1726853684.81307: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 29922 1726853684.81700: done processing included file 29922 1726853684.81702: iterating over new_blocks loaded from include file 29922 1726853684.81704: in VariableManager get_vars() 29922 1726853684.81741: done with get_vars() 29922 1726853684.81743: filtering new block on tags 29922 1726853684.81754: done filtering new block on tags 29922 1726853684.81756: in VariableManager get_vars() 29922 1726853684.81769: done with get_vars() 29922 1726853684.81773: filtering new block on tags 29922 1726853684.81784: done filtering new block on tags 29922 1726853684.81786: in VariableManager get_vars() 29922 1726853684.81803: done with get_vars() 29922 1726853684.81813: filtering new block on tags 29922 1726853684.81828: done filtering new block on tags 29922 1726853684.81830: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3 29922 1726853684.81834: extending task lists for all hosts with included blocks 29922 1726853684.82194: done extending task lists 29922 1726853684.82195: done processing included files 29922 1726853684.82196: results queue empty 29922 1726853684.82197: checking for any_errors_fatal 29922 1726853684.82198: done checking for any_errors_fatal 29922 1726853684.82199: checking for max_fail_percentage 29922 1726853684.82200: done checking for max_fail_percentage 29922 1726853684.82200: checking to see if all hosts have failed and the running result is not ok 29922 1726853684.82201: done checking to see if all hosts have failed 29922 1726853684.82202: getting the remaining hosts for this loop 29922 1726853684.82203: done getting the remaining hosts for this loop 29922 1726853684.82205: getting the next task for host managed_node3 29922 1726853684.82208: done getting next task for host managed_node3 29922 1726853684.82210: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 29922 1726853684.82212: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853684.82221: getting variables 29922 1726853684.82222: in VariableManager get_vars() 29922 1726853684.82235: Calling all_inventory to load vars for managed_node3 29922 1726853684.82238: Calling groups_inventory to load vars for managed_node3 29922 1726853684.82239: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853684.82244: Calling all_plugins_play to load vars for managed_node3 29922 1726853684.82246: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853684.82249: Calling groups_plugins_play to load vars for managed_node3 29922 1726853684.83299: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853684.84895: done with get_vars() 29922 1726853684.84913: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 13:34:44 -0400 (0:00:00.085) 0:00:33.779 ****** 29922 1726853684.84967: entering _queue_task() for managed_node3/setup 29922 1726853684.85224: worker is 1 (out of 1 available) 29922 1726853684.85238: exiting _queue_task() for managed_node3/setup 29922 1726853684.85250: done queuing things up, now waiting for results queue to drain 29922 1726853684.85251: waiting for pending results... 29922 1726853684.85433: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 29922 1726853684.85519: in run() - task 02083763-bbaf-51d4-513b-0000000005de 29922 1726853684.85531: variable 'ansible_search_path' from source: unknown 29922 1726853684.85534: variable 'ansible_search_path' from source: unknown 29922 1726853684.85564: calling self._execute() 29922 1726853684.85642: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853684.85646: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853684.85654: variable 'omit' from source: magic vars 29922 1726853684.85938: variable 'ansible_distribution_major_version' from source: facts 29922 1726853684.85949: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853684.86099: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 29922 1726853684.88036: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 29922 1726853684.88079: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 29922 1726853684.88111: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 29922 1726853684.88136: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 29922 1726853684.88158: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 29922 1726853684.88217: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853684.88238: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 29922 1726853684.88258: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853684.88284: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853684.88296: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853684.88341: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853684.88360: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29922 1726853684.88376: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853684.88401: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853684.88411: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853684.88524: variable '__network_required_facts' from source: role '' defaults 29922 1726853684.88536: variable 'ansible_facts' from source: unknown 29922 1726853684.88973: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 29922 1726853684.88977: when evaluation is False, skipping this task 29922 1726853684.88979: _execute() done 29922 1726853684.88981: dumping result to json 29922 1726853684.88984: done dumping result, returning 29922 1726853684.88988: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [02083763-bbaf-51d4-513b-0000000005de] 29922 1726853684.88990: sending task result for task 02083763-bbaf-51d4-513b-0000000005de 29922 1726853684.89068: done sending task result for task 02083763-bbaf-51d4-513b-0000000005de 29922 1726853684.89070: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 29922 1726853684.89115: no more pending results, returning what we have 29922 1726853684.89118: results queue empty 29922 1726853684.89119: checking for any_errors_fatal 29922 1726853684.89120: done checking for any_errors_fatal 29922 1726853684.89121: checking for max_fail_percentage 29922 1726853684.89122: done checking for max_fail_percentage 29922 1726853684.89123: checking to see if all hosts have failed and the running result is not ok 29922 1726853684.89124: done checking to see if all hosts have failed 29922 1726853684.89125: getting the remaining hosts for this loop 29922 1726853684.89126: done getting the remaining hosts for 
this loop 29922 1726853684.89129: getting the next task for host managed_node3 29922 1726853684.89136: done getting next task for host managed_node3 29922 1726853684.89139: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 29922 1726853684.89142: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29922 1726853684.89158: getting variables 29922 1726853684.89159: in VariableManager get_vars() 29922 1726853684.89197: Calling all_inventory to load vars for managed_node3 29922 1726853684.89200: Calling groups_inventory to load vars for managed_node3 29922 1726853684.89202: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853684.89212: Calling all_plugins_play to load vars for managed_node3 29922 1726853684.89214: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853684.89217: Calling groups_plugins_play to load vars for managed_node3 29922 1726853684.90296: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853684.91758: done with get_vars() 29922 1726853684.91785: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 13:34:44 -0400 (0:00:00.069) 0:00:33.848 ****** 29922 1726853684.91870: entering _queue_task() for managed_node3/stat 29922 1726853684.92135: worker is 1 (out of 1 available) 29922 1726853684.92148: exiting _queue_task() for managed_node3/stat 29922 1726853684.92163: done queuing things up, now waiting for results queue to drain 29922 1726853684.92164: waiting for pending results... 
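[annotation, not part of the captured run] The skip logged just above comes from the role's fact-presence guard: the conditional "__network_required_facts | difference(ansible_facts.keys() | list) | length > 0" only re-runs setup when some required fact key is missing from what was already gathered. A minimal Python sketch of that set-difference test follows; the fact names used here are illustrative assumptions, since the role's actual __network_required_facts default is not visible in this log.

    # Sketch of the set-difference check behind the logged conditional:
    #   __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
    # Hypothetical required-fact list; the real default lives in the role, not in this log.
    required_facts = ["distribution", "distribution_major_version", "os_family"]

    # Keys already present in the gathered facts; modeled here as plain names
    # purely for the comparison, without claiming how the role namespaces them.
    gathered = {
        "distribution": "CentOS",
        "distribution_major_version": "10",
        "os_family": "RedHat",
    }

    missing = [f for f in required_facts if f not in gathered]  # Jinja2's difference filter
    if len(missing) > 0:
        print("facts missing, setup would run again:", missing)
    else:
        print("all required facts present, task is skipped")  # matches the log: conditional evaluated False

In this run every required key was already present from the Gathering Facts task above, so the conditional evaluated False and the task was skipped (its result is censored because no_log is set on it).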
29922 1726853684.92345: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 29922 1726853684.92437: in run() - task 02083763-bbaf-51d4-513b-0000000005e0 29922 1726853684.92448: variable 'ansible_search_path' from source: unknown 29922 1726853684.92456: variable 'ansible_search_path' from source: unknown 29922 1726853684.92491: calling self._execute() 29922 1726853684.92575: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853684.92580: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853684.92586: variable 'omit' from source: magic vars 29922 1726853684.92867: variable 'ansible_distribution_major_version' from source: facts 29922 1726853684.92878: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853684.92996: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 29922 1726853684.93197: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 29922 1726853684.93230: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 29922 1726853684.93256: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 29922 1726853684.93285: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 29922 1726853684.93351: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 29922 1726853684.93374: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 29922 1726853684.93392: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853684.93409: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 29922 1726853684.93478: variable '__network_is_ostree' from source: set_fact 29922 1726853684.93484: Evaluated conditional (not __network_is_ostree is defined): False 29922 1726853684.93486: when evaluation is False, skipping this task 29922 1726853684.93489: _execute() done 29922 1726853684.93491: dumping result to json 29922 1726853684.93494: done dumping result, returning 29922 1726853684.93501: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [02083763-bbaf-51d4-513b-0000000005e0] 29922 1726853684.93506: sending task result for task 02083763-bbaf-51d4-513b-0000000005e0 29922 1726853684.93595: done sending task result for task 02083763-bbaf-51d4-513b-0000000005e0 29922 1726853684.93597: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 29922 1726853684.93642: no more pending results, returning what we have 29922 1726853684.93646: results queue empty 29922 1726853684.93647: checking for any_errors_fatal 29922 1726853684.93655: done checking for any_errors_fatal 29922 1726853684.93655: checking for 
max_fail_percentage 29922 1726853684.93657: done checking for max_fail_percentage 29922 1726853684.93658: checking to see if all hosts have failed and the running result is not ok 29922 1726853684.93659: done checking to see if all hosts have failed 29922 1726853684.93659: getting the remaining hosts for this loop 29922 1726853684.93661: done getting the remaining hosts for this loop 29922 1726853684.93664: getting the next task for host managed_node3 29922 1726853684.93670: done getting next task for host managed_node3 29922 1726853684.93675: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 29922 1726853684.93678: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29922 1726853684.93701: getting variables 29922 1726853684.93702: in VariableManager get_vars() 29922 1726853684.93737: Calling all_inventory to load vars for managed_node3 29922 1726853684.93739: Calling groups_inventory to load vars for managed_node3 29922 1726853684.93741: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853684.93750: Calling all_plugins_play to load vars for managed_node3 29922 1726853684.93753: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853684.93755: Calling groups_plugins_play to load vars for managed_node3 29922 1726853684.94567: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853684.95547: done with get_vars() 29922 1726853684.95563: done getting variables 29922 1726853684.95606: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 13:34:44 -0400 (0:00:00.037) 0:00:33.885 ****** 29922 1726853684.95633: entering _queue_task() for managed_node3/set_fact 29922 1726853684.95883: worker is 1 (out of 1 available) 29922 1726853684.95896: exiting _queue_task() for managed_node3/set_fact 29922 1726853684.95908: done queuing things up, now waiting for results queue to drain 29922 1726853684.95909: waiting for pending results... 
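The set_fact task queued here (set_facts.yml:17, "Set flag to indicate system is ostree") would normally turn the result of the preceding stat check into the __network_is_ostree flag; since that fact is already defined from an earlier set_fact, the log below shows this task being skipped as well. A minimal sketch, assuming the flag is derived from a registered stat result named __ostree_booted_stat (hypothetical name and expression):

- name: Set flag to indicate system is ostree
  set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists | d(false) }}"  # assumed derivation
  when: not __network_is_ostree is defined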
29922 1726853684.96094: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 29922 1726853684.96183: in run() - task 02083763-bbaf-51d4-513b-0000000005e1 29922 1726853684.96202: variable 'ansible_search_path' from source: unknown 29922 1726853684.96212: variable 'ansible_search_path' from source: unknown 29922 1726853684.96241: calling self._execute() 29922 1726853684.96322: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853684.96327: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853684.96336: variable 'omit' from source: magic vars 29922 1726853684.96614: variable 'ansible_distribution_major_version' from source: facts 29922 1726853684.96624: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853684.96739: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 29922 1726853684.96939: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 29922 1726853684.96976: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 29922 1726853684.97002: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 29922 1726853684.97025: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 29922 1726853684.97094: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 29922 1726853684.97114: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 29922 1726853684.97132: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853684.97149: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 29922 1726853684.97215: variable '__network_is_ostree' from source: set_fact 29922 1726853684.97219: Evaluated conditional (not __network_is_ostree is defined): False 29922 1726853684.97222: when evaluation is False, skipping this task 29922 1726853684.97224: _execute() done 29922 1726853684.97227: dumping result to json 29922 1726853684.97232: done dumping result, returning 29922 1726853684.97239: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [02083763-bbaf-51d4-513b-0000000005e1] 29922 1726853684.97243: sending task result for task 02083763-bbaf-51d4-513b-0000000005e1 29922 1726853684.97324: done sending task result for task 02083763-bbaf-51d4-513b-0000000005e1 29922 1726853684.97327: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 29922 1726853684.97378: no more pending results, returning what we have 29922 1726853684.97381: results queue empty 29922 1726853684.97382: checking for any_errors_fatal 29922 1726853684.97388: done checking for any_errors_fatal 29922 
1726853684.97388: checking for max_fail_percentage 29922 1726853684.97390: done checking for max_fail_percentage 29922 1726853684.97391: checking to see if all hosts have failed and the running result is not ok 29922 1726853684.97392: done checking to see if all hosts have failed 29922 1726853684.97392: getting the remaining hosts for this loop 29922 1726853684.97393: done getting the remaining hosts for this loop 29922 1726853684.97397: getting the next task for host managed_node3 29922 1726853684.97406: done getting next task for host managed_node3 29922 1726853684.97409: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 29922 1726853684.97412: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29922 1726853684.97425: getting variables 29922 1726853684.97427: in VariableManager get_vars() 29922 1726853684.97463: Calling all_inventory to load vars for managed_node3 29922 1726853684.97466: Calling groups_inventory to load vars for managed_node3 29922 1726853684.97467: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853684.97478: Calling all_plugins_play to load vars for managed_node3 29922 1726853684.97480: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853684.97483: Calling groups_plugins_play to load vars for managed_node3 29922 1726853684.98279: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853684.99169: done with get_vars() 29922 1726853684.99187: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 13:34:44 -0400 (0:00:00.036) 0:00:33.922 ****** 29922 1726853684.99258: entering _queue_task() for managed_node3/service_facts 29922 1726853684.99504: worker is 1 (out of 1 available) 29922 1726853684.99518: exiting _queue_task() for managed_node3/service_facts 29922 1726853684.99530: done queuing things up, now waiting for results queue to drain 29922 1726853684.99531: waiting for pending results... 
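Unlike the two skipped tasks above, the service_facts task queued here (set_facts.yml:21, "Check which services are running") actually executes: the log that follows shows the module being packaged (ANSIBALLZ), transferred to a temporary directory on the target over the existing SSH control connection, and run with the remote Python interpreter, producing the large ansible_facts.services JSON further down. A minimal sketch of the kind of task that produces this; the register name is an assumption:

- name: Check which services are running
  service_facts:                      # populates ansible_facts.services, as seen in the module output below
  register: __network_services        # hypothetical register name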
29922 1726853684.99709: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running 29922 1726853684.99798: in run() - task 02083763-bbaf-51d4-513b-0000000005e3 29922 1726853684.99809: variable 'ansible_search_path' from source: unknown 29922 1726853684.99813: variable 'ansible_search_path' from source: unknown 29922 1726853684.99843: calling self._execute() 29922 1726853684.99921: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853684.99925: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853684.99934: variable 'omit' from source: magic vars 29922 1726853685.00207: variable 'ansible_distribution_major_version' from source: facts 29922 1726853685.00217: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853685.00224: variable 'omit' from source: magic vars 29922 1726853685.00266: variable 'omit' from source: magic vars 29922 1726853685.00292: variable 'omit' from source: magic vars 29922 1726853685.00326: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29922 1726853685.00352: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29922 1726853685.00368: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29922 1726853685.00383: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853685.00392: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853685.00475: variable 'inventory_hostname' from source: host vars for 'managed_node3' 29922 1726853685.00478: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853685.00480: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853685.00497: Set connection var ansible_connection to ssh 29922 1726853685.00504: Set connection var ansible_module_compression to ZIP_DEFLATED 29922 1726853685.00511: Set connection var ansible_shell_executable to /bin/sh 29922 1726853685.00523: Set connection var ansible_pipelining to False 29922 1726853685.00526: Set connection var ansible_timeout to 10 29922 1726853685.00528: Set connection var ansible_shell_type to sh 29922 1726853685.00544: variable 'ansible_shell_executable' from source: unknown 29922 1726853685.00548: variable 'ansible_connection' from source: unknown 29922 1726853685.00550: variable 'ansible_module_compression' from source: unknown 29922 1726853685.00553: variable 'ansible_shell_type' from source: unknown 29922 1726853685.00558: variable 'ansible_shell_executable' from source: unknown 29922 1726853685.00560: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853685.00564: variable 'ansible_pipelining' from source: unknown 29922 1726853685.00566: variable 'ansible_timeout' from source: unknown 29922 1726853685.00568: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853685.00713: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 29922 1726853685.00722: variable 'omit' from source: magic vars 29922 
1726853685.00727: starting attempt loop 29922 1726853685.00730: running the handler 29922 1726853685.00743: _low_level_execute_command(): starting 29922 1726853685.00751: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 29922 1726853685.01254: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853685.01260: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853685.01263: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853685.01265: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853685.01307: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853685.01319: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853685.01400: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853685.03090: stdout chunk (state=3): >>>/root <<< 29922 1726853685.03183: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853685.03212: stderr chunk (state=3): >>><<< 29922 1726853685.03216: stdout chunk (state=3): >>><<< 29922 1726853685.03238: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853685.03250: _low_level_execute_command(): starting 29922 1726853685.03256: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1726853685.0323713-31552-166969123325246 `" && echo ansible-tmp-1726853685.0323713-31552-166969123325246="` echo /root/.ansible/tmp/ansible-tmp-1726853685.0323713-31552-166969123325246 `" ) && sleep 0' 29922 1726853685.03694: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 29922 1726853685.03698: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853685.03707: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853685.03757: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853685.03762: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853685.03764: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853685.03821: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853685.05763: stdout chunk (state=3): >>>ansible-tmp-1726853685.0323713-31552-166969123325246=/root/.ansible/tmp/ansible-tmp-1726853685.0323713-31552-166969123325246 <<< 29922 1726853685.05865: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853685.05892: stderr chunk (state=3): >>><<< 29922 1726853685.05896: stdout chunk (state=3): >>><<< 29922 1726853685.05910: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853685.0323713-31552-166969123325246=/root/.ansible/tmp/ansible-tmp-1726853685.0323713-31552-166969123325246 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853685.05949: variable 
'ansible_module_compression' from source: unknown 29922 1726853685.05985: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-29922avbm1zsd/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 29922 1726853685.06017: variable 'ansible_facts' from source: unknown 29922 1726853685.06074: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853685.0323713-31552-166969123325246/AnsiballZ_service_facts.py 29922 1726853685.06175: Sending initial data 29922 1726853685.06178: Sent initial data (162 bytes) 29922 1726853685.06611: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853685.06614: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 29922 1726853685.06617: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853685.06619: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853685.06621: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853685.06675: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853685.06683: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853685.06687: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853685.06740: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853685.08328: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 29922 1726853685.08331: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 29922 1726853685.08386: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 29922 1726853685.08447: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-29922avbm1zsd/tmp1wtb2_kt /root/.ansible/tmp/ansible-tmp-1726853685.0323713-31552-166969123325246/AnsiballZ_service_facts.py <<< 29922 1726853685.08449: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853685.0323713-31552-166969123325246/AnsiballZ_service_facts.py" <<< 29922 1726853685.08502: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-29922avbm1zsd/tmp1wtb2_kt" to remote "/root/.ansible/tmp/ansible-tmp-1726853685.0323713-31552-166969123325246/AnsiballZ_service_facts.py" <<< 29922 1726853685.08510: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853685.0323713-31552-166969123325246/AnsiballZ_service_facts.py" <<< 29922 1726853685.09110: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853685.09150: stderr chunk (state=3): >>><<< 29922 1726853685.09154: stdout chunk (state=3): >>><<< 29922 1726853685.09217: done transferring module to remote 29922 1726853685.09226: _low_level_execute_command(): starting 29922 1726853685.09230: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853685.0323713-31552-166969123325246/ /root/.ansible/tmp/ansible-tmp-1726853685.0323713-31552-166969123325246/AnsiballZ_service_facts.py && sleep 0' 29922 1726853685.09680: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853685.09684: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 29922 1726853685.09686: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853685.09688: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 29922 1726853685.09694: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853685.09741: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853685.09745: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853685.09749: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853685.09808: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853685.11646: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853685.11673: stderr chunk (state=3): >>><<< 29922 1726853685.11676: stdout chunk (state=3): >>><<< 29922 1726853685.11688: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853685.11692: _low_level_execute_command(): starting 29922 1726853685.11697: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853685.0323713-31552-166969123325246/AnsiballZ_service_facts.py && sleep 0' 29922 1726853685.12135: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853685.12138: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 29922 1726853685.12141: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853685.12143: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853685.12145: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853685.12196: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853685.12199: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853685.12209: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853685.12281: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853686.72044: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", 
"status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": 
"enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": 
"rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": 
"systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": 
"systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": 
"inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": 
"pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": 
"static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", 
"state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 29922 1726853686.73492: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853686.73506: stderr chunk (state=3): >>>Shared connection to 10.31.11.217 closed. <<< 29922 1726853686.73566: stderr chunk (state=3): >>><<< 29922 1726853686.73598: stdout chunk (state=3): >>><<< 29922 1726853686.73626: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": 
"running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", 
"status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": 
"systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": 
"indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 
debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 29922 1726853686.76066: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853685.0323713-31552-166969123325246/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 29922 1726853686.76074: _low_level_execute_command(): starting 29922 1726853686.76078: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853685.0323713-31552-166969123325246/ > /dev/null 2>&1 && sleep 0' 29922 1726853686.77263: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853686.77268: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853686.77273: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853686.77275: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853686.77505: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853686.77720: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853686.77785: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853686.79747: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853686.79752: stdout chunk (state=3): >>><<< 29922 1726853686.79754: stderr chunk (state=3): >>><<< 29922 1726853686.79975: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853686.79979: handler run complete 29922 1726853686.80259: variable 'ansible_facts' from source: unknown 29922 1726853686.80577: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853686.81911: variable 'ansible_facts' from source: unknown 29922 1726853686.82446: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853686.82977: attempt loop complete, returning result 29922 1726853686.82980: _execute() done 29922 1726853686.82982: dumping result to json 29922 1726853686.82984: done dumping result, returning 29922 1726853686.82986: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [02083763-bbaf-51d4-513b-0000000005e3] 29922 1726853686.82988: sending task result for task 02083763-bbaf-51d4-513b-0000000005e3 29922 1726853686.84875: done sending task result for task 02083763-bbaf-51d4-513b-0000000005e3 29922 1726853686.84879: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 29922 1726853686.84967: no more pending results, returning what we have 29922 1726853686.84969: results queue empty 29922 1726853686.84972: checking for any_errors_fatal 29922 1726853686.84977: done checking for any_errors_fatal 29922 1726853686.84978: checking for max_fail_percentage 29922 1726853686.84979: done checking for max_fail_percentage 29922 1726853686.84980: checking to see if all hosts have failed and the running result is not ok 29922 1726853686.84981: done checking to see if all hosts have failed 29922 1726853686.84982: getting the remaining hosts for this loop 29922 1726853686.84983: done getting the remaining hosts for this loop 29922 1726853686.84986: getting the next task for host managed_node3 29922 1726853686.84991: done getting next task for host managed_node3 29922 1726853686.84995: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 29922 1726853686.84998: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853686.85008: getting variables 29922 1726853686.85009: in VariableManager get_vars() 29922 1726853686.85040: Calling all_inventory to load vars for managed_node3 29922 1726853686.85043: Calling groups_inventory to load vars for managed_node3 29922 1726853686.85045: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853686.85054: Calling all_plugins_play to load vars for managed_node3 29922 1726853686.85056: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853686.85059: Calling groups_plugins_play to load vars for managed_node3 29922 1726853686.86062: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853686.87008: done with get_vars() 29922 1726853686.87030: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 13:34:46 -0400 (0:00:01.878) 0:00:35.800 ****** 29922 1726853686.87140: entering _queue_task() for managed_node3/package_facts 29922 1726853686.87598: worker is 1 (out of 1 available) 29922 1726853686.87646: exiting _queue_task() for managed_node3/package_facts 29922 1726853686.87660: done queuing things up, now waiting for results queue to drain 29922 1726853686.87662: waiting for pending results... 29922 1726853686.87975: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 29922 1726853686.88122: in run() - task 02083763-bbaf-51d4-513b-0000000005e4 29922 1726853686.88145: variable 'ansible_search_path' from source: unknown 29922 1726853686.88153: variable 'ansible_search_path' from source: unknown 29922 1726853686.88201: calling self._execute() 29922 1726853686.88477: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853686.88481: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853686.88484: variable 'omit' from source: magic vars 29922 1726853686.88719: variable 'ansible_distribution_major_version' from source: facts 29922 1726853686.88735: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853686.88749: variable 'omit' from source: magic vars 29922 1726853686.88813: variable 'omit' from source: magic vars 29922 1726853686.88855: variable 'omit' from source: magic vars 29922 1726853686.88889: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29922 1726853686.88915: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29922 1726853686.88931: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29922 1726853686.88943: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853686.88953: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853686.88985: variable 'inventory_hostname' from source: host vars for 'managed_node3' 29922 1726853686.88990: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853686.88992: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853686.89056: Set connection var ansible_connection to ssh 29922 
1726853686.89065: Set connection var ansible_module_compression to ZIP_DEFLATED 29922 1726853686.89073: Set connection var ansible_shell_executable to /bin/sh 29922 1726853686.89092: Set connection var ansible_pipelining to False 29922 1726853686.89095: Set connection var ansible_timeout to 10 29922 1726853686.89098: Set connection var ansible_shell_type to sh 29922 1726853686.89139: variable 'ansible_shell_executable' from source: unknown 29922 1726853686.89142: variable 'ansible_connection' from source: unknown 29922 1726853686.89144: variable 'ansible_module_compression' from source: unknown 29922 1726853686.89151: variable 'ansible_shell_type' from source: unknown 29922 1726853686.89154: variable 'ansible_shell_executable' from source: unknown 29922 1726853686.89156: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853686.89158: variable 'ansible_pipelining' from source: unknown 29922 1726853686.89160: variable 'ansible_timeout' from source: unknown 29922 1726853686.89162: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853686.89302: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 29922 1726853686.89312: variable 'omit' from source: magic vars 29922 1726853686.89315: starting attempt loop 29922 1726853686.89320: running the handler 29922 1726853686.89331: _low_level_execute_command(): starting 29922 1726853686.89338: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 29922 1726853686.89844: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853686.89849: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 29922 1726853686.89853: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853686.89909: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853686.89913: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853686.89991: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853686.91689: stdout chunk (state=3): >>>/root <<< 29922 1726853686.91785: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853686.91811: stderr chunk (state=3): >>><<< 29922 1726853686.91815: stdout chunk (state=3): >>><<< 29922 1726853686.91842: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 
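Having just resolved the remote home directory with the echo ~ probe, the lines that follow show ansible-core's usual remote-execution pattern for a module: create a uniquely named temporary directory under ~/.ansible/tmp with a restrictive umask, transfer the self-contained AnsiballZ_package_facts.py payload over SFTP, mark it executable, run it with the remote Python interpreter, and remove the directory again afterwards (the rm -f -r cleanup for the preceding service_facts run is visible further up). The sketch below imitates only the directory-naming and permission part of that pattern on the local machine; the recipe behind the ansible-tmp-<timestamp>-<number>-<number> name is assumed from the string seen in the log, and the placeholder payload is purely illustrative.

```python
import os
import random
import subprocess
import time

# Build a directory name in the style seen in the log:
#   ansible-tmp-<epoch with fraction>-<number>-<number>
# Assumed recipe: current time, our PID and a random integer.
tmp_name = "ansible-tmp-%s-%s-%s" % (time.time(), os.getpid(), random.randint(0, 2**48))
base = os.path.expanduser("~/.ansible/tmp")
tmp_dir = os.path.join(base, tmp_name)

# Equivalent of: ( umask 77 && mkdir -p ~/.ansible/tmp && mkdir <tmp_dir> )
old_umask = os.umask(0o077)
try:
    os.makedirs(base, exist_ok=True)
    os.mkdir(tmp_dir)
finally:
    os.umask(old_umask)

# In the real run the module payload is copied over SFTP; here a trivial
# placeholder file stands in for AnsiballZ_package_facts.py.
payload = os.path.join(tmp_dir, "AnsiballZ_package_facts.py")
with open(payload, "w") as fh:
    fh.write("#!/usr/bin/python3\nprint('placeholder payload')\n")
os.chmod(payload, 0o700)  # chmod u+x equivalent (owner read/write/execute)

subprocess.run(["python3", payload], check=True)

# Equivalent of: rm -f -r <tmp_dir>
subprocess.run(["rm", "-rf", tmp_dir], check=True)
```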
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853686.91857: _low_level_execute_command(): starting 29922 1726853686.91861: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853686.9184172-31604-112419431197353 `" && echo ansible-tmp-1726853686.9184172-31604-112419431197353="` echo /root/.ansible/tmp/ansible-tmp-1726853686.9184172-31604-112419431197353 `" ) && sleep 0' 29922 1726853686.92313: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853686.92316: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 29922 1726853686.92319: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853686.92321: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853686.92331: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853686.92377: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853686.92382: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853686.92387: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853686.92445: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853686.94579: stdout chunk (state=3): >>>ansible-tmp-1726853686.9184172-31604-112419431197353=/root/.ansible/tmp/ansible-tmp-1726853686.9184172-31604-112419431197353 <<< 29922 1726853686.94684: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853686.94723: stdout chunk (state=3): >>><<< 29922 1726853686.94731: 
stderr chunk (state=3): >>><<< 29922 1726853686.94748: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853686.9184172-31604-112419431197353=/root/.ansible/tmp/ansible-tmp-1726853686.9184172-31604-112419431197353 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853686.94824: variable 'ansible_module_compression' from source: unknown 29922 1726853686.94927: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-29922avbm1zsd/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 29922 1726853686.94986: variable 'ansible_facts' from source: unknown 29922 1726853686.95217: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853686.9184172-31604-112419431197353/AnsiballZ_package_facts.py 29922 1726853686.95402: Sending initial data 29922 1726853686.95405: Sent initial data (162 bytes) 29922 1726853686.95870: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853686.95877: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853686.95880: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration <<< 29922 1726853686.95883: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853686.95885: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 29922 1726853686.95887: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853686.95937: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853686.95940: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853686.96011: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 29922 1726853686.97669: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 29922 1726853686.97740: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 29922 1726853686.97814: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-29922avbm1zsd/tmpx3rgv101 /root/.ansible/tmp/ansible-tmp-1726853686.9184172-31604-112419431197353/AnsiballZ_package_facts.py <<< 29922 1726853686.97818: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853686.9184172-31604-112419431197353/AnsiballZ_package_facts.py" <<< 29922 1726853686.97896: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-29922avbm1zsd/tmpx3rgv101" to remote "/root/.ansible/tmp/ansible-tmp-1726853686.9184172-31604-112419431197353/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853686.9184172-31604-112419431197353/AnsiballZ_package_facts.py" <<< 29922 1726853686.99403: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853686.99536: stderr chunk (state=3): >>><<< 29922 1726853686.99540: stdout chunk (state=3): >>><<< 29922 1726853686.99542: done transferring module to remote 29922 1726853686.99544: _low_level_execute_command(): starting 29922 1726853686.99547: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853686.9184172-31604-112419431197353/ /root/.ansible/tmp/ansible-tmp-1726853686.9184172-31604-112419431197353/AnsiballZ_package_facts.py && sleep 0' 29922 1726853686.99952: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853686.99958: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853686.99960: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853686.99962: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 
1726853687.00012: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853687.00019: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853687.00081: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853687.01969: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853687.02002: stderr chunk (state=3): >>><<< 29922 1726853687.02005: stdout chunk (state=3): >>><<< 29922 1726853687.02020: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853687.02028: _low_level_execute_command(): starting 29922 1726853687.02031: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853686.9184172-31604-112419431197353/AnsiballZ_package_facts.py && sleep 0' 29922 1726853687.02698: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853687.02702: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853687.02704: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853687.02707: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853687.02782: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853687.02851: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session 
id: 2 <<< 29922 1726853687.47955: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": 
"17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", 
"version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": 
"57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": 
"125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": 
[{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", 
"version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": 
"perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", 
"version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", 
"version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": 
[{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "c<<< 29922 1726853687.47982: stdout chunk (state=3): >>>loud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 29922 1726853687.49479: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 29922 1726853687.49484: stdout chunk (state=3): >>><<< 29922 1726853687.49486: stderr chunk (state=3): >>><<< 29922 1726853687.49779: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": 
[{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", 
"release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", 
"version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": 
"1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": 
"1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": 
[{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": 
"510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], 
"perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 29922 1726853687.58276: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853686.9184172-31604-112419431197353/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 29922 1726853687.58306: _low_level_execute_command(): starting 29922 1726853687.58318: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853686.9184172-31604-112419431197353/ > /dev/null 2>&1 && sleep 0' 29922 1726853687.58920: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853687.58934: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853687.58950: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853687.58972: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853687.58991: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 29922 1726853687.59003: stderr chunk (state=3): >>>debug2: match not found <<< 29922 1726853687.59018: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853687.59037: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 29922 1726853687.59050: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<< 29922 1726853687.59358: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853687.59388: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853687.59483: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853687.61553: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853687.61557: stdout chunk (state=3): >>><<< 29922 1726853687.61560: stderr chunk (state=3): >>><<< 29922 1726853687.61578: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853687.61589: handler run complete 29922 1726853687.63633: variable 'ansible_facts' from source: unknown 29922 1726853687.64066: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853687.65934: variable 'ansible_facts' from source: unknown 29922 1726853687.66361: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853687.67049: attempt loop complete, returning result 29922 1726853687.67068: _execute() done 29922 1726853687.67079: dumping result to json 29922 1726853687.67276: done dumping result, returning 29922 1726853687.67291: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [02083763-bbaf-51d4-513b-0000000005e4] 29922 1726853687.67299: sending task result for task 02083763-bbaf-51d4-513b-0000000005e4 29922 1726853687.76015: done sending task result for task 02083763-bbaf-51d4-513b-0000000005e4 29922 1726853687.76019: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 29922 1726853687.76144: no more pending results, returning what we have 29922 1726853687.76147: results queue empty 29922 1726853687.76148: checking for any_errors_fatal 29922 1726853687.76151: done checking for any_errors_fatal 29922 1726853687.76152: checking for max_fail_percentage 29922 1726853687.76154: done checking for max_fail_percentage 29922 1726853687.76158: checking to see if all hosts have failed and the running result is not ok 29922 1726853687.76159: done checking to see if all hosts have failed 29922 1726853687.76159: getting the remaining hosts for this loop 29922 1726853687.76161: done getting the remaining hosts for this loop 29922 1726853687.76164: getting the next task for host managed_node3 29922 1726853687.76169: done getting next task for host managed_node3 29922 1726853687.76178: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 29922 1726853687.76180: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853687.76189: getting variables 29922 1726853687.76190: in VariableManager get_vars() 29922 1726853687.76217: Calling all_inventory to load vars for managed_node3 29922 1726853687.76220: Calling groups_inventory to load vars for managed_node3 29922 1726853687.76222: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853687.76228: Calling all_plugins_play to load vars for managed_node3 29922 1726853687.76231: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853687.76235: Calling groups_plugins_play to load vars for managed_node3 29922 1726853687.77902: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853687.79611: done with get_vars() 29922 1726853687.79640: done getting variables 29922 1726853687.79700: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 13:34:47 -0400 (0:00:00.925) 0:00:36.726 ****** 29922 1726853687.79735: entering _queue_task() for managed_node3/debug 29922 1726853687.80294: worker is 1 (out of 1 available) 29922 1726853687.80305: exiting _queue_task() for managed_node3/debug 29922 1726853687.80315: done queuing things up, now waiting for results queue to drain 29922 1726853687.80316: waiting for pending results... 29922 1726853687.80448: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 29922 1726853687.80653: in run() - task 02083763-bbaf-51d4-513b-000000000092 29922 1726853687.80659: variable 'ansible_search_path' from source: unknown 29922 1726853687.80662: variable 'ansible_search_path' from source: unknown 29922 1726853687.80665: calling self._execute() 29922 1726853687.80741: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853687.80761: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853687.80781: variable 'omit' from source: magic vars 29922 1726853687.81228: variable 'ansible_distribution_major_version' from source: facts 29922 1726853687.81256: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853687.81268: variable 'omit' from source: magic vars 29922 1726853687.81316: variable 'omit' from source: magic vars 29922 1726853687.81417: variable 'network_provider' from source: set_fact 29922 1726853687.81438: variable 'omit' from source: magic vars 29922 1726853687.81481: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29922 1726853687.81524: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29922 1726853687.81547: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29922 1726853687.81631: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853687.81634: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 
1726853687.81636: variable 'inventory_hostname' from source: host vars for 'managed_node3' 29922 1726853687.81638: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853687.81640: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853687.81727: Set connection var ansible_connection to ssh 29922 1726853687.81750: Set connection var ansible_module_compression to ZIP_DEFLATED 29922 1726853687.81768: Set connection var ansible_shell_executable to /bin/sh 29922 1726853687.81784: Set connection var ansible_pipelining to False 29922 1726853687.81793: Set connection var ansible_timeout to 10 29922 1726853687.81799: Set connection var ansible_shell_type to sh 29922 1726853687.81829: variable 'ansible_shell_executable' from source: unknown 29922 1726853687.81839: variable 'ansible_connection' from source: unknown 29922 1726853687.81858: variable 'ansible_module_compression' from source: unknown 29922 1726853687.81877: variable 'ansible_shell_type' from source: unknown 29922 1726853687.81880: variable 'ansible_shell_executable' from source: unknown 29922 1726853687.81882: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853687.81963: variable 'ansible_pipelining' from source: unknown 29922 1726853687.81966: variable 'ansible_timeout' from source: unknown 29922 1726853687.81969: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853687.82099: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 29922 1726853687.82118: variable 'omit' from source: magic vars 29922 1726853687.82129: starting attempt loop 29922 1726853687.82136: running the handler 29922 1726853687.82199: handler run complete 29922 1726853687.82232: attempt loop complete, returning result 29922 1726853687.82240: _execute() done 29922 1726853687.82248: dumping result to json 29922 1726853687.82257: done dumping result, returning 29922 1726853687.82287: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [02083763-bbaf-51d4-513b-000000000092] 29922 1726853687.82290: sending task result for task 02083763-bbaf-51d4-513b-000000000092 29922 1726853687.82527: done sending task result for task 02083763-bbaf-51d4-513b-000000000092 29922 1726853687.82530: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: Using network provider: nm 29922 1726853687.82609: no more pending results, returning what we have 29922 1726853687.82613: results queue empty 29922 1726853687.82614: checking for any_errors_fatal 29922 1726853687.82625: done checking for any_errors_fatal 29922 1726853687.82626: checking for max_fail_percentage 29922 1726853687.82628: done checking for max_fail_percentage 29922 1726853687.82629: checking to see if all hosts have failed and the running result is not ok 29922 1726853687.82630: done checking to see if all hosts have failed 29922 1726853687.82631: getting the remaining hosts for this loop 29922 1726853687.82633: done getting the remaining hosts for this loop 29922 1726853687.82638: getting the next task for host managed_node3 29922 1726853687.82644: done getting next task for host managed_node3 29922 1726853687.82649: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state 
configuration if using the `network_state` variable with the initscripts provider 29922 1726853687.82651: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29922 1726853687.82666: getting variables 29922 1726853687.82668: in VariableManager get_vars() 29922 1726853687.82708: Calling all_inventory to load vars for managed_node3 29922 1726853687.82710: Calling groups_inventory to load vars for managed_node3 29922 1726853687.82830: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853687.82844: Calling all_plugins_play to load vars for managed_node3 29922 1726853687.82847: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853687.82851: Calling groups_plugins_play to load vars for managed_node3 29922 1726853687.84567: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853687.86311: done with get_vars() 29922 1726853687.86341: done getting variables 29922 1726853687.86410: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 13:34:47 -0400 (0:00:00.067) 0:00:36.793 ****** 29922 1726853687.86442: entering _queue_task() for managed_node3/fail 29922 1726853687.86906: worker is 1 (out of 1 available) 29922 1726853687.86919: exiting _queue_task() for managed_node3/fail 29922 1726853687.86930: done queuing things up, now waiting for results queue to drain 29922 1726853687.86931: waiting for pending results... 
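The "Check which packages are installed" result above is censored because no_log: true was set on it, but the logged module_args show the role invoking package_facts with manager ["auto"] and strategy "first". Below is a minimal standalone sketch of that call; the play name, hosts pattern, and the follow-up debug task are illustrative additions, not part of the role source.

- name: Sketch of the role's package_facts call (illustrative play)
  hosts: all
  tasks:
    - name: Check which packages are installed
      ansible.builtin.package_facts:
        manager: auto      # logged module_args: {"manager": ["auto"], ...}
        strategy: first    # logged module_args: {..., "strategy": "first"}
      no_log: true         # matches the censored result shown in this log

    - name: Example use of the gathered facts (not from this run)
      ansible.builtin.debug:
        msg: "openssl: {{ ansible_facts.packages.get('openssl', [{}])[0].get('version', 'not installed') }}"

Each key in ansible_facts.packages maps to a list of dicts with name, version, release, epoch, arch, and source, which is the structure visible in the raw module output above.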
29922 1726853687.87288: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 29922 1726853687.87293: in run() - task 02083763-bbaf-51d4-513b-000000000093 29922 1726853687.87313: variable 'ansible_search_path' from source: unknown 29922 1726853687.87319: variable 'ansible_search_path' from source: unknown 29922 1726853687.87446: calling self._execute() 29922 1726853687.87495: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853687.87509: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853687.87541: variable 'omit' from source: magic vars 29922 1726853687.88106: variable 'ansible_distribution_major_version' from source: facts 29922 1726853687.88109: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853687.88252: variable 'network_state' from source: role '' defaults 29922 1726853687.88274: Evaluated conditional (network_state != {}): False 29922 1726853687.88283: when evaluation is False, skipping this task 29922 1726853687.88290: _execute() done 29922 1726853687.88298: dumping result to json 29922 1726853687.88306: done dumping result, returning 29922 1726853687.88323: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [02083763-bbaf-51d4-513b-000000000093] 29922 1726853687.88338: sending task result for task 02083763-bbaf-51d4-513b-000000000093 skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 29922 1726853687.88619: no more pending results, returning what we have 29922 1726853687.88623: results queue empty 29922 1726853687.88625: checking for any_errors_fatal 29922 1726853687.88675: done checking for any_errors_fatal 29922 1726853687.88676: checking for max_fail_percentage 29922 1726853687.88678: done checking for max_fail_percentage 29922 1726853687.88679: checking to see if all hosts have failed and the running result is not ok 29922 1726853687.88680: done checking to see if all hosts have failed 29922 1726853687.88681: getting the remaining hosts for this loop 29922 1726853687.88682: done getting the remaining hosts for this loop 29922 1726853687.88687: getting the next task for host managed_node3 29922 1726853687.88694: done getting next task for host managed_node3 29922 1726853687.88698: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 29922 1726853687.88701: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853687.88719: getting variables 29922 1726853687.88721: in VariableManager get_vars() 29922 1726853687.88883: Calling all_inventory to load vars for managed_node3 29922 1726853687.88886: Calling groups_inventory to load vars for managed_node3 29922 1726853687.88889: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853687.88895: done sending task result for task 02083763-bbaf-51d4-513b-000000000093 29922 1726853687.88899: WORKER PROCESS EXITING 29922 1726853687.88910: Calling all_plugins_play to load vars for managed_node3 29922 1726853687.88913: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853687.88916: Calling groups_plugins_play to load vars for managed_node3 29922 1726853687.91427: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853687.93096: done with get_vars() 29922 1726853687.93125: done getting variables 29922 1726853687.93210: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 13:34:47 -0400 (0:00:00.067) 0:00:36.861 ****** 29922 1726853687.93242: entering _queue_task() for managed_node3/fail 29922 1726853687.94409: worker is 1 (out of 1 available) 29922 1726853687.94422: exiting _queue_task() for managed_node3/fail 29922 1726853687.94433: done queuing things up, now waiting for results queue to drain 29922 1726853687.94435: waiting for pending results... 
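The task just skipped (main.yml:11) loads the fail action and, per the logged false_condition, is gated on network_state != {}; the distribution check logged before it (ansible_distribution_major_version != '6') appears ahead of every task in this run and is most likely inherited from an enclosing block. A sketch of that guard pattern under those assumptions; the msg wording is a placeholder, and the real task may carry additional conditions this run never evaluated:

- name: Abort applying the network state configuration if using the network_state variable with the initscripts provider
  ansible.builtin.fail:
    msg: Applying network_state with the initscripts provider is not supported.  # placeholder wording
  when: network_state != {}   # logged false_condition; network_state comes from role defaults here and is empty, so the task is skipped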
29922 1726853687.94740: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 29922 1726853687.94930: in run() - task 02083763-bbaf-51d4-513b-000000000094 29922 1726853687.94995: variable 'ansible_search_path' from source: unknown 29922 1726853687.94999: variable 'ansible_search_path' from source: unknown 29922 1726853687.95149: calling self._execute() 29922 1726853687.95309: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853687.95316: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853687.95328: variable 'omit' from source: magic vars 29922 1726853687.96140: variable 'ansible_distribution_major_version' from source: facts 29922 1726853687.96159: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853687.96531: variable 'network_state' from source: role '' defaults 29922 1726853687.96536: Evaluated conditional (network_state != {}): False 29922 1726853687.96538: when evaluation is False, skipping this task 29922 1726853687.96542: _execute() done 29922 1726853687.96545: dumping result to json 29922 1726853687.96548: done dumping result, returning 29922 1726853687.96551: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [02083763-bbaf-51d4-513b-000000000094] 29922 1726853687.96641: sending task result for task 02083763-bbaf-51d4-513b-000000000094 29922 1726853687.96719: done sending task result for task 02083763-bbaf-51d4-513b-000000000094 29922 1726853687.96723: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 29922 1726853687.96782: no more pending results, returning what we have 29922 1726853687.96787: results queue empty 29922 1726853687.96788: checking for any_errors_fatal 29922 1726853687.96799: done checking for any_errors_fatal 29922 1726853687.96800: checking for max_fail_percentage 29922 1726853687.96802: done checking for max_fail_percentage 29922 1726853687.96803: checking to see if all hosts have failed and the running result is not ok 29922 1726853687.96804: done checking to see if all hosts have failed 29922 1726853687.96805: getting the remaining hosts for this loop 29922 1726853687.96806: done getting the remaining hosts for this loop 29922 1726853687.96810: getting the next task for host managed_node3 29922 1726853687.96815: done getting next task for host managed_node3 29922 1726853687.96819: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 29922 1726853687.96822: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853687.96839: getting variables 29922 1726853687.96841: in VariableManager get_vars() 29922 1726853687.96883: Calling all_inventory to load vars for managed_node3 29922 1726853687.96886: Calling groups_inventory to load vars for managed_node3 29922 1726853687.96888: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853687.96899: Calling all_plugins_play to load vars for managed_node3 29922 1726853687.96901: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853687.96903: Calling groups_plugins_play to load vars for managed_node3 29922 1726853687.99332: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853688.01462: done with get_vars() 29922 1726853688.01596: done getting variables 29922 1726853688.01660: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 13:34:48 -0400 (0:00:00.084) 0:00:36.946 ****** 29922 1726853688.01696: entering _queue_task() for managed_node3/fail 29922 1726853688.02339: worker is 1 (out of 1 available) 29922 1726853688.02353: exiting _queue_task() for managed_node3/fail 29922 1726853688.02364: done queuing things up, now waiting for results queue to drain 29922 1726853688.02366: waiting for pending results... 
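The teaming-abort task queued above (main.yml:25) is evaluated a few entries below: ansible_distribution_major_version | int > 9 and ansible_distribution in __network_rh_distros are both True, and then the selectattr filter chain over network_connections and network_state comes out False, so the task is skipped. A self-contained sketch of that filter chain with illustrative data (the vars below are placeholders, not values from this run); with no team-type entries the condition is False and the fail task is skipped, matching the logged result:

- name: Demonstrate the team-detection condition from the log (illustrative data)
  hosts: localhost
  gather_facts: false
  vars:
    network_connections:
      - name: ethtest0      # placeholder profile name
        type: ethernet
    network_state: {}
  tasks:
    - name: Abort applying teaming configuration if the managed host is EL10 or later
      ansible.builtin.fail:
        msg: Team interfaces are not supported on this release.  # placeholder wording
      when: >-
        network_connections | selectattr("type", "defined") |
        selectattr("type", "match", "^team$") | list | length > 0
        or network_state.get("interfaces", []) | selectattr("type", "defined") |
        selectattr("type", "match", "^team$") | list | length > 0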
29922 1726853688.02662: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 29922 1726853688.02869: in run() - task 02083763-bbaf-51d4-513b-000000000095 29922 1726853688.02875: variable 'ansible_search_path' from source: unknown 29922 1726853688.02878: variable 'ansible_search_path' from source: unknown 29922 1726853688.02880: calling self._execute() 29922 1726853688.02949: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853688.02962: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853688.02985: variable 'omit' from source: magic vars 29922 1726853688.03401: variable 'ansible_distribution_major_version' from source: facts 29922 1726853688.03427: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853688.03611: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 29922 1726853688.08879: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 29922 1726853688.08968: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 29922 1726853688.09152: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 29922 1726853688.09340: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 29922 1726853688.09345: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 29922 1726853688.09546: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853688.09667: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29922 1726853688.09777: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853688.09810: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853688.09949: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853688.10212: variable 'ansible_distribution_major_version' from source: facts 29922 1726853688.10215: Evaluated conditional (ansible_distribution_major_version | int > 9): True 29922 1726853688.10539: variable 'ansible_distribution' from source: facts 29922 1726853688.10543: variable '__network_rh_distros' from source: role '' defaults 29922 1726853688.10545: Evaluated conditional (ansible_distribution in __network_rh_distros): True 29922 1726853688.11081: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853688.11307: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29922 1726853688.11311: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853688.11314: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853688.11316: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853688.11458: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853688.11550: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29922 1726853688.11584: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853688.11677: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853688.11762: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853688.11809: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853688.11880: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29922 1726853688.12072: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853688.12076: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853688.12078: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853688.12751: variable 'network_connections' from source: play vars 29922 1726853688.12789: variable 'profile' from source: play vars 29922 1726853688.13078: variable 'profile' from source: play vars 29922 1726853688.13081: variable 'interface' from source: set_fact 29922 1726853688.13122: variable 'interface' from source: set_fact 29922 1726853688.13136: variable 'network_state' from source: role '' defaults 29922 
1726853688.13332: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 29922 1726853688.13706: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 29922 1726853688.13817: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 29922 1726853688.13920: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 29922 1726853688.14104: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 29922 1726853688.14133: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 29922 1726853688.14203: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 29922 1726853688.14346: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853688.14350: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 29922 1726853688.14416: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 29922 1726853688.14611: when evaluation is False, skipping this task 29922 1726853688.14614: _execute() done 29922 1726853688.14616: dumping result to json 29922 1726853688.14618: done dumping result, returning 29922 1726853688.14620: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [02083763-bbaf-51d4-513b-000000000095] 29922 1726853688.14623: sending task result for task 02083763-bbaf-51d4-513b-000000000095 29922 1726853688.14708: done sending task result for task 02083763-bbaf-51d4-513b-000000000095 29922 1726853688.14711: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 29922 1726853688.14767: no more pending results, returning what we have 29922 1726853688.14773: results queue empty 29922 1726853688.14774: checking for any_errors_fatal 29922 1726853688.14782: done checking for any_errors_fatal 29922 1726853688.14783: checking for max_fail_percentage 29922 1726853688.14784: done checking for max_fail_percentage 29922 1726853688.14785: checking to see if all hosts have failed and the running result is not ok 29922 1726853688.14786: done checking to see if all hosts have failed 29922 1726853688.14787: getting the remaining hosts for this loop 29922 1726853688.14788: done getting the remaining hosts for this loop 29922 1726853688.14793: getting the next 
task for host managed_node3 29922 1726853688.14799: done getting next task for host managed_node3 29922 1726853688.14803: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 29922 1726853688.14805: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29922 1726853688.14820: getting variables 29922 1726853688.14822: in VariableManager get_vars() 29922 1726853688.14868: Calling all_inventory to load vars for managed_node3 29922 1726853688.15193: Calling groups_inventory to load vars for managed_node3 29922 1726853688.15197: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853688.15209: Calling all_plugins_play to load vars for managed_node3 29922 1726853688.15212: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853688.15215: Calling groups_plugins_play to load vars for managed_node3 29922 1726853688.18415: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853688.22157: done with get_vars() 29922 1726853688.22197: done getting variables 29922 1726853688.22417: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 13:34:48 -0400 (0:00:00.207) 0:00:37.154 ****** 29922 1726853688.22495: entering _queue_task() for managed_node3/dnf 29922 1726853688.23319: worker is 1 (out of 1 available) 29922 1726853688.23332: exiting _queue_task() for managed_node3/dnf 29922 1726853688.23344: done queuing things up, now waiting for results queue to drain 29922 1726853688.23346: waiting for pending results... 
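The DNF check queued above (main.yml:36) uses the dnf action plugin, and the evaluation that follows shows it skipped because neither __network_wireless_connections_defined nor __network_team_connections_defined holds for this profile. A hedged sketch of that guard; the package name, state, and check_mode flag are assumptions, since the actual module arguments are never printed in this log:

- name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
  ansible.builtin.dnf:
    name: NetworkManager   # placeholder package; not taken from this log
    state: latest          # assumption: pending updates surface as a reported change
  check_mode: true         # assumption: check only, do not install
  when:
    - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7  # logged as True
    - __network_wireless_connections_defined or __network_team_connections_defined      # logged as False, so the task is skipped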
29922 1726853688.23982: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 29922 1726853688.24018: in run() - task 02083763-bbaf-51d4-513b-000000000096 29922 1726853688.24178: variable 'ansible_search_path' from source: unknown 29922 1726853688.24183: variable 'ansible_search_path' from source: unknown 29922 1726853688.24187: calling self._execute() 29922 1726853688.24401: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853688.24414: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853688.24647: variable 'omit' from source: magic vars 29922 1726853688.25267: variable 'ansible_distribution_major_version' from source: facts 29922 1726853688.25287: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853688.25675: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 29922 1726853688.30366: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 29922 1726853688.30431: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 29922 1726853688.30510: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 29922 1726853688.31077: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 29922 1726853688.31081: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 29922 1726853688.31084: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853688.32004: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29922 1726853688.32207: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853688.32255: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853688.32676: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853688.32679: variable 'ansible_distribution' from source: facts 29922 1726853688.32682: variable 'ansible_distribution_major_version' from source: facts 29922 1726853688.32684: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 29922 1726853688.32750: variable '__network_wireless_connections_defined' from source: role '' defaults 29922 1726853688.33107: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853688.33136: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29922 1726853688.33167: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853688.33417: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853688.33437: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853688.33487: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853688.33513: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29922 1726853688.33541: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853688.33590: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853688.33610: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853688.33723: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853688.33750: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29922 1726853688.33835: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853688.33886: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853688.33906: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853688.34078: variable 'network_connections' from source: play vars 29922 1726853688.34095: variable 'profile' from source: play vars 29922 1726853688.34173: variable 'profile' from source: play vars 29922 1726853688.34239: variable 'interface' from source: set_fact 29922 1726853688.34250: variable 'interface' from source: set_fact 29922 1726853688.34323: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' 
skipped due to reserved name 29922 1726853688.34505: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 29922 1726853688.34546: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 29922 1726853688.34585: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 29922 1726853688.34617: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 29922 1726853688.34663: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 29922 1726853688.34695: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 29922 1726853688.34777: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853688.34780: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 29922 1726853688.34808: variable '__network_team_connections_defined' from source: role '' defaults 29922 1726853688.35048: variable 'network_connections' from source: play vars 29922 1726853688.35058: variable 'profile' from source: play vars 29922 1726853688.35125: variable 'profile' from source: play vars 29922 1726853688.35134: variable 'interface' from source: set_fact 29922 1726853688.35196: variable 'interface' from source: set_fact 29922 1726853688.35230: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 29922 1726853688.35319: when evaluation is False, skipping this task 29922 1726853688.35322: _execute() done 29922 1726853688.35324: dumping result to json 29922 1726853688.35326: done dumping result, returning 29922 1726853688.35329: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [02083763-bbaf-51d4-513b-000000000096] 29922 1726853688.35331: sending task result for task 02083763-bbaf-51d4-513b-000000000096 29922 1726853688.35400: done sending task result for task 02083763-bbaf-51d4-513b-000000000096 29922 1726853688.35403: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 29922 1726853688.35477: no more pending results, returning what we have 29922 1726853688.35481: results queue empty 29922 1726853688.35481: checking for any_errors_fatal 29922 1726853688.35490: done checking for any_errors_fatal 29922 1726853688.35491: checking for max_fail_percentage 29922 1726853688.35493: done checking for max_fail_percentage 29922 1726853688.35494: checking to see if all hosts have failed and the running result is not ok 29922 1726853688.35495: done checking to see if all hosts have failed 29922 1726853688.35495: getting the remaining hosts for this loop 29922 1726853688.35497: done getting the remaining hosts for this loop 29922 
1726853688.35501: getting the next task for host managed_node3 29922 1726853688.35506: done getting next task for host managed_node3 29922 1726853688.35510: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 29922 1726853688.35513: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29922 1726853688.35534: getting variables 29922 1726853688.35536: in VariableManager get_vars() 29922 1726853688.35575: Calling all_inventory to load vars for managed_node3 29922 1726853688.35578: Calling groups_inventory to load vars for managed_node3 29922 1726853688.35585: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853688.35597: Calling all_plugins_play to load vars for managed_node3 29922 1726853688.35600: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853688.35603: Calling groups_plugins_play to load vars for managed_node3 29922 1726853688.37870: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853688.39827: done with get_vars() 29922 1726853688.39858: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 29922 1726853688.39939: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 13:34:48 -0400 (0:00:00.174) 0:00:37.329 ****** 29922 1726853688.39970: entering _queue_task() for managed_node3/yum 29922 1726853688.40328: worker is 1 (out of 1 available) 29922 1726853688.40343: exiting _queue_task() for managed_node3/yum 29922 1726853688.40354: done queuing things up, now waiting for results queue to drain 29922 1726853688.40355: waiting for pending results... 
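The task announced here (tasks/main.yml:48) is the YUM counterpart of the previous check; note the redirect logged just above, where ansible.builtin.yum is resolved to the dnf action plugin on this host. The trace shows it is gated on the distribution major version, so a sketch of the likely shape (the package list and check_mode usage are assumptions; only the conditional comes from the trace):

  - name: >-
      Check if updates for network packages are available through the YUM
      package manager due to wireless or team interfaces
    ansible.builtin.yum:               # redirected to the dnf action on this host, per the log line above
      name: "{{ network_packages }}"   # assumed package list variable
      state: latest
    check_mode: true                   # assumed
    when: ansible_distribution_major_version | int < 8

Because this host's major version is not below 8, the conditional is False and the task is skipped, as the result below shows.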
29922 1726853688.40878: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 29922 1726853688.40888: in run() - task 02083763-bbaf-51d4-513b-000000000097 29922 1726853688.40911: variable 'ansible_search_path' from source: unknown 29922 1726853688.40919: variable 'ansible_search_path' from source: unknown 29922 1726853688.40957: calling self._execute() 29922 1726853688.41075: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853688.41092: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853688.41177: variable 'omit' from source: magic vars 29922 1726853688.41490: variable 'ansible_distribution_major_version' from source: facts 29922 1726853688.41507: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853688.41686: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 29922 1726853688.44869: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 29922 1726853688.44913: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 29922 1726853688.45014: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 29922 1726853688.45123: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 29922 1726853688.45155: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 29922 1726853688.45347: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853688.45437: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29922 1726853688.45578: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853688.45595: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853688.45649: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853688.45820: variable 'ansible_distribution_major_version' from source: facts 29922 1726853688.45896: Evaluated conditional (ansible_distribution_major_version | int < 8): False 29922 1726853688.46066: when evaluation is False, skipping this task 29922 1726853688.46070: _execute() done 29922 1726853688.46074: dumping result to json 29922 1726853688.46077: done dumping result, returning 29922 1726853688.46080: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [02083763-bbaf-51d4-513b-000000000097] 29922 
1726853688.46083: sending task result for task 02083763-bbaf-51d4-513b-000000000097 29922 1726853688.46156: done sending task result for task 02083763-bbaf-51d4-513b-000000000097 29922 1726853688.46159: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 29922 1726853688.46222: no more pending results, returning what we have 29922 1726853688.46226: results queue empty 29922 1726853688.46227: checking for any_errors_fatal 29922 1726853688.46235: done checking for any_errors_fatal 29922 1726853688.46236: checking for max_fail_percentage 29922 1726853688.46237: done checking for max_fail_percentage 29922 1726853688.46238: checking to see if all hosts have failed and the running result is not ok 29922 1726853688.46239: done checking to see if all hosts have failed 29922 1726853688.46240: getting the remaining hosts for this loop 29922 1726853688.46241: done getting the remaining hosts for this loop 29922 1726853688.46246: getting the next task for host managed_node3 29922 1726853688.46251: done getting next task for host managed_node3 29922 1726853688.46255: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 29922 1726853688.46257: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29922 1726853688.46275: getting variables 29922 1726853688.46277: in VariableManager get_vars() 29922 1726853688.46317: Calling all_inventory to load vars for managed_node3 29922 1726853688.46320: Calling groups_inventory to load vars for managed_node3 29922 1726853688.46323: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853688.46333: Calling all_plugins_play to load vars for managed_node3 29922 1726853688.46336: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853688.46339: Calling groups_plugins_play to load vars for managed_node3 29922 1726853688.48259: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853688.50903: done with get_vars() 29922 1726853688.50937: done getting variables 29922 1726853688.50999: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 13:34:48 -0400 (0:00:00.110) 0:00:37.439 ****** 29922 1726853688.51028: entering _queue_task() for managed_node3/fail 29922 1726853688.51674: worker is 1 (out of 1 available) 29922 1726853688.51684: exiting _queue_task() for managed_node3/fail 29922 1726853688.51696: done queuing things up, now waiting for results queue to drain 29922 1726853688.51697: waiting for pending results... 
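The task announced here (tasks/main.yml:60) uses the fail action to stop the play and ask for the operator's consent before NetworkManager is restarted. Based on the false_condition reported in its result below, a minimal sketch (the failure message is not visible in this trace and is left as a placeholder):

  - name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
    ansible.builtin.fail:
      msg: "..."   # actual message text not shown in this trace
    when: __network_wireless_connections_defined or __network_team_connections_defined

As with the DNF check above, neither wireless nor team connections are defined in this run, so the guard is False and the task is skipped.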
29922 1726853688.52190: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 29922 1726853688.52195: in run() - task 02083763-bbaf-51d4-513b-000000000098 29922 1726853688.52198: variable 'ansible_search_path' from source: unknown 29922 1726853688.52200: variable 'ansible_search_path' from source: unknown 29922 1726853688.52224: calling self._execute() 29922 1726853688.52337: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853688.52348: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853688.52362: variable 'omit' from source: magic vars 29922 1726853688.52755: variable 'ansible_distribution_major_version' from source: facts 29922 1726853688.52776: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853688.52904: variable '__network_wireless_connections_defined' from source: role '' defaults 29922 1726853688.53107: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 29922 1726853688.55518: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 29922 1726853688.55592: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 29922 1726853688.55636: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 29922 1726853688.55758: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 29922 1726853688.55761: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 29922 1726853688.55800: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853688.55837: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29922 1726853688.55876: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853688.55922: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853688.55942: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853688.56001: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853688.56029: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29922 1726853688.56059: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853688.56112: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853688.56133: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853688.56179: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853688.56211: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29922 1726853688.56276: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853688.56284: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853688.56309: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853688.56483: variable 'network_connections' from source: play vars 29922 1726853688.56499: variable 'profile' from source: play vars 29922 1726853688.56580: variable 'profile' from source: play vars 29922 1726853688.56630: variable 'interface' from source: set_fact 29922 1726853688.56663: variable 'interface' from source: set_fact 29922 1726853688.56746: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 29922 1726853688.56907: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 29922 1726853688.56948: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 29922 1726853688.56987: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 29922 1726853688.57017: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 29922 1726853688.57064: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 29922 1726853688.57276: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 29922 1726853688.57279: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853688.57282: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 29922 1726853688.57284: 
variable '__network_team_connections_defined' from source: role '' defaults 29922 1726853688.57419: variable 'network_connections' from source: play vars 29922 1726853688.57430: variable 'profile' from source: play vars 29922 1726853688.57494: variable 'profile' from source: play vars 29922 1726853688.57507: variable 'interface' from source: set_fact 29922 1726853688.57566: variable 'interface' from source: set_fact 29922 1726853688.57598: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 29922 1726853688.57606: when evaluation is False, skipping this task 29922 1726853688.57617: _execute() done 29922 1726853688.57624: dumping result to json 29922 1726853688.57631: done dumping result, returning 29922 1726853688.57642: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [02083763-bbaf-51d4-513b-000000000098] 29922 1726853688.57662: sending task result for task 02083763-bbaf-51d4-513b-000000000098 29922 1726853688.57978: done sending task result for task 02083763-bbaf-51d4-513b-000000000098 29922 1726853688.57981: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 29922 1726853688.58031: no more pending results, returning what we have 29922 1726853688.58034: results queue empty 29922 1726853688.58035: checking for any_errors_fatal 29922 1726853688.58042: done checking for any_errors_fatal 29922 1726853688.58043: checking for max_fail_percentage 29922 1726853688.58044: done checking for max_fail_percentage 29922 1726853688.58045: checking to see if all hosts have failed and the running result is not ok 29922 1726853688.58046: done checking to see if all hosts have failed 29922 1726853688.58047: getting the remaining hosts for this loop 29922 1726853688.58048: done getting the remaining hosts for this loop 29922 1726853688.58051: getting the next task for host managed_node3 29922 1726853688.58056: done getting next task for host managed_node3 29922 1726853688.58060: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 29922 1726853688.58062: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853688.58078: getting variables 29922 1726853688.58080: in VariableManager get_vars() 29922 1726853688.58118: Calling all_inventory to load vars for managed_node3 29922 1726853688.58120: Calling groups_inventory to load vars for managed_node3 29922 1726853688.58122: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853688.58132: Calling all_plugins_play to load vars for managed_node3 29922 1726853688.58135: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853688.58137: Calling groups_plugins_play to load vars for managed_node3 29922 1726853688.59735: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853688.61290: done with get_vars() 29922 1726853688.61318: done getting variables 29922 1726853688.61383: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 13:34:48 -0400 (0:00:00.103) 0:00:37.543 ****** 29922 1726853688.61418: entering _queue_task() for managed_node3/package 29922 1726853688.61767: worker is 1 (out of 1 available) 29922 1726853688.61783: exiting _queue_task() for managed_node3/package 29922 1726853688.61797: done queuing things up, now waiting for results queue to drain 29922 1726853688.61798: waiting for pending results... 29922 1726853688.62087: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages 29922 1726853688.62194: in run() - task 02083763-bbaf-51d4-513b-000000000099 29922 1726853688.62221: variable 'ansible_search_path' from source: unknown 29922 1726853688.62229: variable 'ansible_search_path' from source: unknown 29922 1726853688.62269: calling self._execute() 29922 1726853688.62385: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853688.62396: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853688.62409: variable 'omit' from source: magic vars 29922 1726853688.62778: variable 'ansible_distribution_major_version' from source: facts 29922 1726853688.62796: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853688.62998: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 29922 1726853688.63266: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 29922 1726853688.63320: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 29922 1726853688.63359: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 29922 1726853688.63452: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 29922 1726853688.63575: variable 'network_packages' from source: role '' defaults 29922 1726853688.63721: variable '__network_provider_setup' from source: role '' defaults 29922 1726853688.63724: variable '__network_service_name_default_nm' from source: role '' defaults 29922 1726853688.63775: variable 
'__network_service_name_default_nm' from source: role '' defaults 29922 1726853688.63790: variable '__network_packages_default_nm' from source: role '' defaults 29922 1726853688.63857: variable '__network_packages_default_nm' from source: role '' defaults 29922 1726853688.64044: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 29922 1726853688.66019: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 29922 1726853688.66095: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 29922 1726853688.66146: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 29922 1726853688.66186: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 29922 1726853688.66219: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 29922 1726853688.66317: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853688.66357: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29922 1726853688.66391: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853688.66438: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853688.66464: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853688.66515: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853688.66542: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29922 1726853688.66577: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853688.66622: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853688.66642: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853688.66886: variable '__network_packages_default_gobject_packages' from source: role '' defaults 29922 1726853688.67004: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853688.67034: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29922 1726853688.67064: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853688.67176: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853688.67180: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853688.67228: variable 'ansible_python' from source: facts 29922 1726853688.67259: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 29922 1726853688.67350: variable '__network_wpa_supplicant_required' from source: role '' defaults 29922 1726853688.67434: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 29922 1726853688.67583: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853688.67612: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29922 1726853688.67643: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853688.67757: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853688.67760: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853688.67762: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853688.67795: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29922 1726853688.67825: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853688.67869: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853688.67896: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853688.68044: variable 'network_connections' from source: play vars 29922 1726853688.68055: variable 'profile' from source: play vars 29922 1726853688.68214: variable 'profile' from source: play vars 29922 1726853688.68218: variable 'interface' from source: set_fact 29922 1726853688.68248: variable 'interface' from source: set_fact 29922 1726853688.68327: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 29922 1726853688.68359: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 29922 1726853688.68396: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853688.68436: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 29922 1726853688.68489: variable '__network_wireless_connections_defined' from source: role '' defaults 29922 1726853688.68789: variable 'network_connections' from source: play vars 29922 1726853688.68799: variable 'profile' from source: play vars 29922 1726853688.68975: variable 'profile' from source: play vars 29922 1726853688.68978: variable 'interface' from source: set_fact 29922 1726853688.68980: variable 'interface' from source: set_fact 29922 1726853688.69006: variable '__network_packages_default_wireless' from source: role '' defaults 29922 1726853688.69084: variable '__network_wireless_connections_defined' from source: role '' defaults 29922 1726853688.69374: variable 'network_connections' from source: play vars 29922 1726853688.69384: variable 'profile' from source: play vars 29922 1726853688.69448: variable 'profile' from source: play vars 29922 1726853688.69456: variable 'interface' from source: set_fact 29922 1726853688.69553: variable 'interface' from source: set_fact 29922 1726853688.69586: variable '__network_packages_default_team' from source: role '' defaults 29922 1726853688.69668: variable '__network_team_connections_defined' from source: role '' defaults 29922 1726853688.69981: variable 'network_connections' from source: play vars 29922 1726853688.69991: variable 'profile' from source: play vars 29922 1726853688.70070: variable 'profile' from source: play vars 29922 1726853688.70075: variable 'interface' from source: set_fact 29922 1726853688.70161: variable 'interface' from source: set_fact 29922 1726853688.70277: variable '__network_service_name_default_initscripts' from source: role '' defaults 29922 1726853688.70293: variable '__network_service_name_default_initscripts' from source: role '' defaults 29922 1726853688.70305: variable '__network_packages_default_initscripts' from source: role '' defaults 29922 1726853688.70365: variable '__network_packages_default_initscripts' from source: role '' defaults 29922 1726853688.70589: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 29922 1726853688.71084: variable 'network_connections' from source: play vars 29922 1726853688.71095: variable 'profile' from source: play vars 29922 
1726853688.71158: variable 'profile' from source: play vars 29922 1726853688.71168: variable 'interface' from source: set_fact 29922 1726853688.71234: variable 'interface' from source: set_fact 29922 1726853688.71375: variable 'ansible_distribution' from source: facts 29922 1726853688.71378: variable '__network_rh_distros' from source: role '' defaults 29922 1726853688.71381: variable 'ansible_distribution_major_version' from source: facts 29922 1726853688.71383: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 29922 1726853688.71443: variable 'ansible_distribution' from source: facts 29922 1726853688.71452: variable '__network_rh_distros' from source: role '' defaults 29922 1726853688.71462: variable 'ansible_distribution_major_version' from source: facts 29922 1726853688.71483: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 29922 1726853688.71648: variable 'ansible_distribution' from source: facts 29922 1726853688.71658: variable '__network_rh_distros' from source: role '' defaults 29922 1726853688.71668: variable 'ansible_distribution_major_version' from source: facts 29922 1726853688.71714: variable 'network_provider' from source: set_fact 29922 1726853688.71736: variable 'ansible_facts' from source: unknown 29922 1726853688.72636: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 29922 1726853688.72645: when evaluation is False, skipping this task 29922 1726853688.72652: _execute() done 29922 1726853688.72659: dumping result to json 29922 1726853688.72667: done dumping result, returning 29922 1726853688.72685: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [02083763-bbaf-51d4-513b-000000000099] 29922 1726853688.72776: sending task result for task 02083763-bbaf-51d4-513b-000000000099 29922 1726853688.72850: done sending task result for task 02083763-bbaf-51d4-513b-000000000099 29922 1726853688.72853: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 29922 1726853688.72904: no more pending results, returning what we have 29922 1726853688.72907: results queue empty 29922 1726853688.72908: checking for any_errors_fatal 29922 1726853688.72916: done checking for any_errors_fatal 29922 1726853688.72916: checking for max_fail_percentage 29922 1726853688.72918: done checking for max_fail_percentage 29922 1726853688.72919: checking to see if all hosts have failed and the running result is not ok 29922 1726853688.72920: done checking to see if all hosts have failed 29922 1726853688.72920: getting the remaining hosts for this loop 29922 1726853688.72922: done getting the remaining hosts for this loop 29922 1726853688.72926: getting the next task for host managed_node3 29922 1726853688.72932: done getting next task for host managed_node3 29922 1726853688.72936: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 29922 1726853688.72938: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853688.72952: getting variables 29922 1726853688.72954: in VariableManager get_vars() 29922 1726853688.72995: Calling all_inventory to load vars for managed_node3 29922 1726853688.72997: Calling groups_inventory to load vars for managed_node3 29922 1726853688.73000: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853688.73015: Calling all_plugins_play to load vars for managed_node3 29922 1726853688.73018: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853688.73020: Calling groups_plugins_play to load vars for managed_node3 29922 1726853688.74646: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853688.76375: done with get_vars() 29922 1726853688.76400: done getting variables 29922 1726853688.76463: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 13:34:48 -0400 (0:00:00.150) 0:00:37.694 ****** 29922 1726853688.76498: entering _queue_task() for managed_node3/package 29922 1726853688.76857: worker is 1 (out of 1 available) 29922 1726853688.76976: exiting _queue_task() for managed_node3/package 29922 1726853688.76990: done queuing things up, now waiting for results queue to drain 29922 1726853688.76992: waiting for pending results... 
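Two install guards are visible around this point in the trace. The preceding task (tasks/main.yml:73, Install packages) was skipped because every entry in network_packages is already present in the gathered package facts, via the subset test quoted in its false_condition; the task announced here (tasks/main.yml:85) instead only runs when a network_state dictionary is supplied. A minimal sketch of the first guard, using the package action the trace resolves (arguments other than the conditional are assumptions):

  - name: Install packages
    ansible.builtin.package:
      name: "{{ network_packages }}"   # the trace confirms the variable name, not the arguments
      state: present
    when: not network_packages is subset(ansible_facts.packages.keys())

The subset test keeps the package module from being invoked at all when nothing would change, which is why the trace records a skip rather than an ok result.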
29922 1726853688.77291: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 29922 1726853688.77328: in run() - task 02083763-bbaf-51d4-513b-00000000009a 29922 1726853688.77387: variable 'ansible_search_path' from source: unknown 29922 1726853688.77391: variable 'ansible_search_path' from source: unknown 29922 1726853688.77405: calling self._execute() 29922 1726853688.77526: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853688.77540: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853688.77555: variable 'omit' from source: magic vars 29922 1726853688.78037: variable 'ansible_distribution_major_version' from source: facts 29922 1726853688.78040: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853688.78088: variable 'network_state' from source: role '' defaults 29922 1726853688.78106: Evaluated conditional (network_state != {}): False 29922 1726853688.78115: when evaluation is False, skipping this task 29922 1726853688.78123: _execute() done 29922 1726853688.78131: dumping result to json 29922 1726853688.78138: done dumping result, returning 29922 1726853688.78154: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [02083763-bbaf-51d4-513b-00000000009a] 29922 1726853688.78167: sending task result for task 02083763-bbaf-51d4-513b-00000000009a skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 29922 1726853688.78319: no more pending results, returning what we have 29922 1726853688.78323: results queue empty 29922 1726853688.78324: checking for any_errors_fatal 29922 1726853688.78333: done checking for any_errors_fatal 29922 1726853688.78334: checking for max_fail_percentage 29922 1726853688.78336: done checking for max_fail_percentage 29922 1726853688.78337: checking to see if all hosts have failed and the running result is not ok 29922 1726853688.78338: done checking to see if all hosts have failed 29922 1726853688.78338: getting the remaining hosts for this loop 29922 1726853688.78339: done getting the remaining hosts for this loop 29922 1726853688.78343: getting the next task for host managed_node3 29922 1726853688.78350: done getting next task for host managed_node3 29922 1726853688.78354: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 29922 1726853688.78356: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853688.78475: getting variables 29922 1726853688.78477: in VariableManager get_vars() 29922 1726853688.78516: Calling all_inventory to load vars for managed_node3 29922 1726853688.78519: Calling groups_inventory to load vars for managed_node3 29922 1726853688.78522: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853688.78535: Calling all_plugins_play to load vars for managed_node3 29922 1726853688.78538: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853688.78542: Calling groups_plugins_play to load vars for managed_node3 29922 1726853688.79287: done sending task result for task 02083763-bbaf-51d4-513b-00000000009a 29922 1726853688.79290: WORKER PROCESS EXITING 29922 1726853688.80118: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853688.81678: done with get_vars() 29922 1726853688.81706: done getting variables 29922 1726853688.81767: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 13:34:48 -0400 (0:00:00.053) 0:00:37.747 ****** 29922 1726853688.81800: entering _queue_task() for managed_node3/package 29922 1726853688.82153: worker is 1 (out of 1 available) 29922 1726853688.82166: exiting _queue_task() for managed_node3/package 29922 1726853688.82282: done queuing things up, now waiting for results queue to drain 29922 1726853688.82284: waiting for pending results... 
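The task announced here (tasks/main.yml:96) and the one traced just above it (tasks/main.yml:85) share the same guard: they run only when the caller passes a non-empty network_state, and this play does not. A minimal sketch of the python3-libnmstate variant (the module arguments are assumptions; the conditional is quoted from the trace and the package name is implied by the task title):

  - name: Install python3-libnmstate when using network_state variable
    ansible.builtin.package:
      name: python3-libnmstate
      state: present
    when: network_state != {}

With network_state left at its empty default, both tasks evaluate the conditional to False and are skipped.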
29922 1726853688.82467: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 29922 1726853688.82585: in run() - task 02083763-bbaf-51d4-513b-00000000009b 29922 1726853688.82604: variable 'ansible_search_path' from source: unknown 29922 1726853688.82614: variable 'ansible_search_path' from source: unknown 29922 1726853688.82658: calling self._execute() 29922 1726853688.82774: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853688.82788: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853688.82803: variable 'omit' from source: magic vars 29922 1726853688.83192: variable 'ansible_distribution_major_version' from source: facts 29922 1726853688.83209: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853688.83331: variable 'network_state' from source: role '' defaults 29922 1726853688.83349: Evaluated conditional (network_state != {}): False 29922 1726853688.83357: when evaluation is False, skipping this task 29922 1726853688.83364: _execute() done 29922 1726853688.83373: dumping result to json 29922 1726853688.83384: done dumping result, returning 29922 1726853688.83396: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [02083763-bbaf-51d4-513b-00000000009b] 29922 1726853688.83406: sending task result for task 02083763-bbaf-51d4-513b-00000000009b 29922 1726853688.83618: done sending task result for task 02083763-bbaf-51d4-513b-00000000009b 29922 1726853688.83621: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 29922 1726853688.83673: no more pending results, returning what we have 29922 1726853688.83677: results queue empty 29922 1726853688.83678: checking for any_errors_fatal 29922 1726853688.83687: done checking for any_errors_fatal 29922 1726853688.83688: checking for max_fail_percentage 29922 1726853688.83690: done checking for max_fail_percentage 29922 1726853688.83691: checking to see if all hosts have failed and the running result is not ok 29922 1726853688.83692: done checking to see if all hosts have failed 29922 1726853688.83693: getting the remaining hosts for this loop 29922 1726853688.83694: done getting the remaining hosts for this loop 29922 1726853688.83698: getting the next task for host managed_node3 29922 1726853688.83703: done getting next task for host managed_node3 29922 1726853688.83707: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 29922 1726853688.83710: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853688.83728: getting variables 29922 1726853688.83730: in VariableManager get_vars() 29922 1726853688.83772: Calling all_inventory to load vars for managed_node3 29922 1726853688.83775: Calling groups_inventory to load vars for managed_node3 29922 1726853688.83778: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853688.83791: Calling all_plugins_play to load vars for managed_node3 29922 1726853688.83794: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853688.83798: Calling groups_plugins_play to load vars for managed_node3 29922 1726853688.85518: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853688.87061: done with get_vars() 29922 1726853688.87090: done getting variables 29922 1726853688.87152: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 13:34:48 -0400 (0:00:00.053) 0:00:37.801 ****** 29922 1726853688.87187: entering _queue_task() for managed_node3/service 29922 1726853688.87535: worker is 1 (out of 1 available) 29922 1726853688.87548: exiting _queue_task() for managed_node3/service 29922 1726853688.87560: done queuing things up, now waiting for results queue to drain 29922 1726853688.87562: waiting for pending results... 
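The task announced here (tasks/main.yml:109) resolves to the service action and, per its title, restarts NetworkManager only when wireless or team interfaces are involved; this part of the trace ends before its result is printed. A hedged sketch of what such a task usually looks like (the service name and the exact guard are assumptions inferred from the task title and the role defaults being loaded below):

  - name: Restart NetworkManager due to wireless or team interfaces
    ansible.builtin.service:
      name: NetworkManager               # assumed from the task title
      state: restarted
    when: __network_wireless_connections_defined or __network_team_connections_defined   # assumed guard

Since the same wireless/team flags were already evaluated to False earlier in this block, this task would presumably be skipped as well.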
29922 1726853688.87992: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 29922 1726853688.87997: in run() - task 02083763-bbaf-51d4-513b-00000000009c 29922 1726853688.88000: variable 'ansible_search_path' from source: unknown 29922 1726853688.88002: variable 'ansible_search_path' from source: unknown 29922 1726853688.88026: calling self._execute() 29922 1726853688.88136: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853688.88149: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853688.88163: variable 'omit' from source: magic vars 29922 1726853688.88535: variable 'ansible_distribution_major_version' from source: facts 29922 1726853688.88555: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853688.88676: variable '__network_wireless_connections_defined' from source: role '' defaults 29922 1726853688.88879: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 29922 1726853688.90593: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 29922 1726853688.90637: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 29922 1726853688.90665: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 29922 1726853688.90692: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 29922 1726853688.90714: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 29922 1726853688.90775: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853688.90807: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29922 1726853688.90826: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853688.90858: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853688.90867: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853688.90903: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853688.90919: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29922 1726853688.90936: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 29922 1726853688.90963: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853688.90974: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853688.91004: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853688.91019: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29922 1726853688.91036: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853688.91064: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853688.91076: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853688.91186: variable 'network_connections' from source: play vars 29922 1726853688.91196: variable 'profile' from source: play vars 29922 1726853688.91247: variable 'profile' from source: play vars 29922 1726853688.91250: variable 'interface' from source: set_fact 29922 1726853688.91297: variable 'interface' from source: set_fact 29922 1726853688.91346: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 29922 1726853688.91462: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 29922 1726853688.91498: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 29922 1726853688.91525: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 29922 1726853688.91565: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 29922 1726853688.91659: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 29922 1726853688.91662: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 29922 1726853688.91665: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853688.91688: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 29922 1726853688.91703: variable '__network_team_connections_defined' from source: role '' defaults 29922 
1726853688.91978: variable 'network_connections' from source: play vars 29922 1726853688.91982: variable 'profile' from source: play vars 29922 1726853688.92002: variable 'profile' from source: play vars 29922 1726853688.92005: variable 'interface' from source: set_fact 29922 1726853688.92052: variable 'interface' from source: set_fact 29922 1726853688.92076: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 29922 1726853688.92079: when evaluation is False, skipping this task 29922 1726853688.92083: _execute() done 29922 1726853688.92085: dumping result to json 29922 1726853688.92087: done dumping result, returning 29922 1726853688.92099: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [02083763-bbaf-51d4-513b-00000000009c] 29922 1726853688.92109: sending task result for task 02083763-bbaf-51d4-513b-00000000009c 29922 1726853688.92187: done sending task result for task 02083763-bbaf-51d4-513b-00000000009c 29922 1726853688.92190: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 29922 1726853688.92240: no more pending results, returning what we have 29922 1726853688.92243: results queue empty 29922 1726853688.92244: checking for any_errors_fatal 29922 1726853688.92252: done checking for any_errors_fatal 29922 1726853688.92253: checking for max_fail_percentage 29922 1726853688.92257: done checking for max_fail_percentage 29922 1726853688.92257: checking to see if all hosts have failed and the running result is not ok 29922 1726853688.92258: done checking to see if all hosts have failed 29922 1726853688.92259: getting the remaining hosts for this loop 29922 1726853688.92260: done getting the remaining hosts for this loop 29922 1726853688.92264: getting the next task for host managed_node3 29922 1726853688.92270: done getting next task for host managed_node3 29922 1726853688.92275: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 29922 1726853688.92277: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853688.92290: getting variables 29922 1726853688.92291: in VariableManager get_vars() 29922 1726853688.92327: Calling all_inventory to load vars for managed_node3 29922 1726853688.92329: Calling groups_inventory to load vars for managed_node3 29922 1726853688.92331: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853688.92340: Calling all_plugins_play to load vars for managed_node3 29922 1726853688.92343: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853688.92345: Calling groups_plugins_play to load vars for managed_node3 29922 1726853688.93451: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853688.94346: done with get_vars() 29922 1726853688.94364: done getting variables 29922 1726853688.94409: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 13:34:48 -0400 (0:00:00.072) 0:00:37.873 ****** 29922 1726853688.94432: entering _queue_task() for managed_node3/service 29922 1726853688.94682: worker is 1 (out of 1 available) 29922 1726853688.94695: exiting _queue_task() for managed_node3/service 29922 1726853688.94707: done queuing things up, now waiting for results queue to drain 29922 1726853688.94708: waiting for pending results... 29922 1726853688.94900: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 29922 1726853688.94975: in run() - task 02083763-bbaf-51d4-513b-00000000009d 29922 1726853688.94987: variable 'ansible_search_path' from source: unknown 29922 1726853688.94991: variable 'ansible_search_path' from source: unknown 29922 1726853688.95019: calling self._execute() 29922 1726853688.95131: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853688.95137: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853688.95140: variable 'omit' from source: magic vars 29922 1726853688.95675: variable 'ansible_distribution_major_version' from source: facts 29922 1726853688.95679: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853688.95717: variable 'network_provider' from source: set_fact 29922 1726853688.95728: variable 'network_state' from source: role '' defaults 29922 1726853688.95738: Evaluated conditional (network_provider == "nm" or network_state != {}): True 29922 1726853688.95745: variable 'omit' from source: magic vars 29922 1726853688.95784: variable 'omit' from source: magic vars 29922 1726853688.95812: variable 'network_service_name' from source: role '' defaults 29922 1726853688.95891: variable 'network_service_name' from source: role '' defaults 29922 1726853688.96000: variable '__network_provider_setup' from source: role '' defaults 29922 1726853688.96005: variable '__network_service_name_default_nm' from source: role '' defaults 29922 1726853688.96077: variable '__network_service_name_default_nm' from source: role '' defaults 29922 1726853688.96087: variable '__network_packages_default_nm' from source: role '' defaults 
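For the "Enable and start NetworkManager" task, the condition (network_provider == "nm" or network_state != {}) evaluates to True, so the role resolves its service-name and package defaults and then runs the systemd module on managed_node3 over SSH. A minimal, hypothetical sketch of a task that would produce the module invocation visible in the result JSON further down (name=NetworkManager, state=started, enabled=true); on this systemd host the 'service' action plugin dispatches to ansible.legacy.systemd, as the _execute_module line later in the log shows:

- name: Enable and start NetworkManager
  ansible.builtin.service:
    name: NetworkManager   # matches the module_args in the returned result
    state: started
    enabled: true
  when: network_provider == "nm" or network_state != {}

The returned result reports changed=false because the unit is already enabled (UnitFileState=enabled) and running (ActiveState=active), so the task is effectively a no-op on this host.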
29922 1726853688.96149: variable '__network_packages_default_nm' from source: role '' defaults 29922 1726853688.96391: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 29922 1726853688.98264: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 29922 1726853688.98307: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 29922 1726853688.98335: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 29922 1726853688.98362: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 29922 1726853688.98384: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 29922 1726853688.98448: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853688.98472: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29922 1726853688.98492: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853688.98545: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853688.98557: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853688.98612: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853688.98630: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29922 1726853688.98653: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853688.98706: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853688.98717: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853688.98930: variable '__network_packages_default_gobject_packages' from source: role '' defaults 29922 1726853688.99176: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853688.99179: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29922 1726853688.99181: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853688.99183: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853688.99185: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853688.99274: variable 'ansible_python' from source: facts 29922 1726853688.99302: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 29922 1726853688.99399: variable '__network_wpa_supplicant_required' from source: role '' defaults 29922 1726853688.99489: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 29922 1726853688.99622: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853688.99652: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29922 1726853688.99688: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853688.99735: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853688.99758: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853688.99804: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853688.99844: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29922 1726853688.99882: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853688.99928: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853688.99952: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853689.00059: variable 'network_connections' from 
source: play vars 29922 1726853689.00064: variable 'profile' from source: play vars 29922 1726853689.00180: variable 'profile' from source: play vars 29922 1726853689.00183: variable 'interface' from source: set_fact 29922 1726853689.00200: variable 'interface' from source: set_fact 29922 1726853689.00296: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 29922 1726853689.00615: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 29922 1726853689.00618: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 29922 1726853689.00620: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 29922 1726853689.00623: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 29922 1726853689.00648: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 29922 1726853689.00876: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 29922 1726853689.00880: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853689.00882: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 29922 1726853689.00885: variable '__network_wireless_connections_defined' from source: role '' defaults 29922 1726853689.01056: variable 'network_connections' from source: play vars 29922 1726853689.01066: variable 'profile' from source: play vars 29922 1726853689.01135: variable 'profile' from source: play vars 29922 1726853689.01140: variable 'interface' from source: set_fact 29922 1726853689.01201: variable 'interface' from source: set_fact 29922 1726853689.01233: variable '__network_packages_default_wireless' from source: role '' defaults 29922 1726853689.01312: variable '__network_wireless_connections_defined' from source: role '' defaults 29922 1726853689.01597: variable 'network_connections' from source: play vars 29922 1726853689.01601: variable 'profile' from source: play vars 29922 1726853689.01655: variable 'profile' from source: play vars 29922 1726853689.01661: variable 'interface' from source: set_fact 29922 1726853689.01708: variable 'interface' from source: set_fact 29922 1726853689.01727: variable '__network_packages_default_team' from source: role '' defaults 29922 1726853689.01791: variable '__network_team_connections_defined' from source: role '' defaults 29922 1726853689.01973: variable 'network_connections' from source: play vars 29922 1726853689.01977: variable 'profile' from source: play vars 29922 1726853689.02026: variable 'profile' from source: play vars 29922 1726853689.02030: variable 'interface' from source: set_fact 29922 1726853689.02082: variable 'interface' from source: set_fact 29922 1726853689.02124: variable '__network_service_name_default_initscripts' from source: role '' defaults 29922 1726853689.02166: variable '__network_service_name_default_initscripts' from source: role '' defaults 29922 1726853689.02172: 
variable '__network_packages_default_initscripts' from source: role '' defaults 29922 1726853689.02215: variable '__network_packages_default_initscripts' from source: role '' defaults 29922 1726853689.02347: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 29922 1726853689.02650: variable 'network_connections' from source: play vars 29922 1726853689.02653: variable 'profile' from source: play vars 29922 1726853689.02699: variable 'profile' from source: play vars 29922 1726853689.02702: variable 'interface' from source: set_fact 29922 1726853689.02750: variable 'interface' from source: set_fact 29922 1726853689.02756: variable 'ansible_distribution' from source: facts 29922 1726853689.02762: variable '__network_rh_distros' from source: role '' defaults 29922 1726853689.02769: variable 'ansible_distribution_major_version' from source: facts 29922 1726853689.02782: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 29922 1726853689.02893: variable 'ansible_distribution' from source: facts 29922 1726853689.02896: variable '__network_rh_distros' from source: role '' defaults 29922 1726853689.02901: variable 'ansible_distribution_major_version' from source: facts 29922 1726853689.02912: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 29922 1726853689.03024: variable 'ansible_distribution' from source: facts 29922 1726853689.03027: variable '__network_rh_distros' from source: role '' defaults 29922 1726853689.03030: variable 'ansible_distribution_major_version' from source: facts 29922 1726853689.03057: variable 'network_provider' from source: set_fact 29922 1726853689.03081: variable 'omit' from source: magic vars 29922 1726853689.03103: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29922 1726853689.03124: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29922 1726853689.03139: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29922 1726853689.03151: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853689.03162: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853689.03191: variable 'inventory_hostname' from source: host vars for 'managed_node3' 29922 1726853689.03194: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853689.03197: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853689.03262: Set connection var ansible_connection to ssh 29922 1726853689.03268: Set connection var ansible_module_compression to ZIP_DEFLATED 29922 1726853689.03278: Set connection var ansible_shell_executable to /bin/sh 29922 1726853689.03284: Set connection var ansible_pipelining to False 29922 1726853689.03290: Set connection var ansible_timeout to 10 29922 1726853689.03293: Set connection var ansible_shell_type to sh 29922 1726853689.03314: variable 'ansible_shell_executable' from source: unknown 29922 1726853689.03317: variable 'ansible_connection' from source: unknown 29922 1726853689.03319: variable 'ansible_module_compression' from source: unknown 29922 1726853689.03322: variable 'ansible_shell_type' from source: unknown 29922 1726853689.03324: variable 'ansible_shell_executable' from 
source: unknown 29922 1726853689.03326: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853689.03333: variable 'ansible_pipelining' from source: unknown 29922 1726853689.03335: variable 'ansible_timeout' from source: unknown 29922 1726853689.03337: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853689.03411: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 29922 1726853689.03419: variable 'omit' from source: magic vars 29922 1726853689.03425: starting attempt loop 29922 1726853689.03428: running the handler 29922 1726853689.03565: variable 'ansible_facts' from source: unknown 29922 1726853689.04265: _low_level_execute_command(): starting 29922 1726853689.04268: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 29922 1726853689.04830: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853689.04844: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853689.04847: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853689.04850: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853689.04902: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853689.04905: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853689.04979: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853689.06681: stdout chunk (state=3): >>>/root <<< 29922 1726853689.06781: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853689.06808: stderr chunk (state=3): >>><<< 29922 1726853689.06811: stdout chunk (state=3): >>><<< 29922 1726853689.06829: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853689.06839: _low_level_execute_command(): starting 29922 1726853689.06844: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853689.0682914-31693-11307459767849 `" && echo ansible-tmp-1726853689.0682914-31693-11307459767849="` echo /root/.ansible/tmp/ansible-tmp-1726853689.0682914-31693-11307459767849 `" ) && sleep 0' 29922 1726853689.07291: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853689.07294: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853689.07297: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853689.07299: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853689.07301: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 29922 1726853689.07303: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853689.07354: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853689.07357: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853689.07362: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853689.07419: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853689.09378: stdout chunk (state=3): >>>ansible-tmp-1726853689.0682914-31693-11307459767849=/root/.ansible/tmp/ansible-tmp-1726853689.0682914-31693-11307459767849 <<< 29922 1726853689.09485: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853689.09510: stderr chunk (state=3): >>><<< 29922 1726853689.09513: stdout chunk (state=3): >>><<< 29922 1726853689.09525: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853689.0682914-31693-11307459767849=/root/.ansible/tmp/ansible-tmp-1726853689.0682914-31693-11307459767849 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 
10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853689.09552: variable 'ansible_module_compression' from source: unknown 29922 1726853689.09602: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-29922avbm1zsd/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 29922 1726853689.09655: variable 'ansible_facts' from source: unknown 29922 1726853689.09799: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853689.0682914-31693-11307459767849/AnsiballZ_systemd.py 29922 1726853689.09910: Sending initial data 29922 1726853689.09913: Sent initial data (155 bytes) 29922 1726853689.10349: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853689.10370: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 29922 1726853689.10376: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853689.10389: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853689.10441: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853689.10444: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853689.10450: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853689.10505: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853689.12115: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 
1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 29922 1726853689.12173: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 29922 1726853689.12226: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-29922avbm1zsd/tmpt2ao1rqt /root/.ansible/tmp/ansible-tmp-1726853689.0682914-31693-11307459767849/AnsiballZ_systemd.py <<< 29922 1726853689.12235: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853689.0682914-31693-11307459767849/AnsiballZ_systemd.py" <<< 29922 1726853689.12284: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-29922avbm1zsd/tmpt2ao1rqt" to remote "/root/.ansible/tmp/ansible-tmp-1726853689.0682914-31693-11307459767849/AnsiballZ_systemd.py" <<< 29922 1726853689.12290: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853689.0682914-31693-11307459767849/AnsiballZ_systemd.py" <<< 29922 1726853689.13434: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853689.13479: stderr chunk (state=3): >>><<< 29922 1726853689.13482: stdout chunk (state=3): >>><<< 29922 1726853689.13509: done transferring module to remote 29922 1726853689.13518: _low_level_execute_command(): starting 29922 1726853689.13523: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853689.0682914-31693-11307459767849/ /root/.ansible/tmp/ansible-tmp-1726853689.0682914-31693-11307459767849/AnsiballZ_systemd.py && sleep 0' 29922 1726853689.13952: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853689.13987: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853689.13990: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 29922 1726853689.13992: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 29922 1726853689.13994: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853689.13996: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853689.14045: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853689.14048: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853689.14107: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853689.15949: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 
1726853689.15980: stderr chunk (state=3): >>><<< 29922 1726853689.15983: stdout chunk (state=3): >>><<< 29922 1726853689.15998: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853689.16001: _low_level_execute_command(): starting 29922 1726853689.16006: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853689.0682914-31693-11307459767849/AnsiballZ_systemd.py && sleep 0' 29922 1726853689.16429: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853689.16466: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 29922 1726853689.16469: stderr chunk (state=3): >>>debug2: match not found <<< 29922 1726853689.16479: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853689.16482: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853689.16484: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 29922 1726853689.16486: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853689.16526: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853689.16529: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853689.16535: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853689.16600: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853689.46226: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", 
"NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "705", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ExecMainStartTimestampMonotonic": "24298536", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ExecMainHandoffTimestampMonotonic": "24318182", "ExecMainPID": "705", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10596352", "MemoryPeak": "14114816", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3325468672", "EffectiveMemoryMax": "3702865920", "EffectiveMemoryHigh": "3702865920", "CPUUsageNSec": "2137339000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", 
"StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpR<<< 29922 1726853689.46242: stdout chunk (state=3): >>>eceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice sysinit.target dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.target shutdown.target cloud-init.service NetworkManager-wait-online.service multi-user.target", "After": "sysini<<< 29922 1726853689.46261: stdout chunk (state=3): >>>t.target systemd-journald.socket basic.target cloud-init-local.service network-pre.target dbus.socket system.slice dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:26:57 EDT", "StateChangeTimestampMonotonic": "361843458", "InactiveExitTimestamp": "Fri 2024-09-20 13:21:20 EDT", "InactiveExitTimestampMonotonic": "24299070", "ActiveEnterTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ActiveEnterTimestampMonotonic": "24855925", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ConditionTimestampMonotonic": "24297535", "AssertTimestamp": "Fri 2024-09-20 13:21:20 EDT", "AssertTimestampMonotonic": "24297537", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "125a1bdc44cb4bffa8aeca788d2f2fa3", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 29922 1726853689.48295: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 29922 1726853689.48325: stderr chunk (state=3): >>><<< 29922 1726853689.48329: stdout chunk (state=3): >>><<< 29922 1726853689.48344: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "705", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ExecMainStartTimestampMonotonic": "24298536", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ExecMainHandoffTimestampMonotonic": "24318182", "ExecMainPID": "705", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10596352", "MemoryPeak": "14114816", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3325468672", "EffectiveMemoryMax": "3702865920", "EffectiveMemoryHigh": "3702865920", "CPUUsageNSec": "2137339000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", 
"DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice sysinit.target dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.target shutdown.target cloud-init.service NetworkManager-wait-online.service multi-user.target", "After": "sysinit.target systemd-journald.socket basic.target cloud-init-local.service network-pre.target dbus.socket system.slice dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:26:57 EDT", "StateChangeTimestampMonotonic": "361843458", "InactiveExitTimestamp": "Fri 2024-09-20 13:21:20 EDT", "InactiveExitTimestampMonotonic": "24299070", "ActiveEnterTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ActiveEnterTimestampMonotonic": "24855925", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ConditionTimestampMonotonic": "24297535", "AssertTimestamp": "Fri 2024-09-20 13:21:20 EDT", "AssertTimestampMonotonic": "24297537", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "125a1bdc44cb4bffa8aeca788d2f2fa3", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 29922 1726853689.48461: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853689.0682914-31693-11307459767849/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 29922 1726853689.48478: _low_level_execute_command(): starting 29922 1726853689.48482: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853689.0682914-31693-11307459767849/ > /dev/null 2>&1 && sleep 0' 29922 1726853689.48937: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853689.48940: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 29922 1726853689.48942: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853689.48944: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853689.48946: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853689.49003: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853689.49007: stderr chunk (state=3): >>>debug2: 
fd 3 setting O_NONBLOCK <<< 29922 1726853689.49013: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853689.49076: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853689.50952: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853689.50978: stderr chunk (state=3): >>><<< 29922 1726853689.50982: stdout chunk (state=3): >>><<< 29922 1726853689.50994: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853689.51003: handler run complete 29922 1726853689.51042: attempt loop complete, returning result 29922 1726853689.51045: _execute() done 29922 1726853689.51048: dumping result to json 29922 1726853689.51062: done dumping result, returning 29922 1726853689.51072: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [02083763-bbaf-51d4-513b-00000000009d] 29922 1726853689.51076: sending task result for task 02083763-bbaf-51d4-513b-00000000009d 29922 1726853689.51267: done sending task result for task 02083763-bbaf-51d4-513b-00000000009d 29922 1726853689.51272: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 29922 1726853689.51323: no more pending results, returning what we have 29922 1726853689.51326: results queue empty 29922 1726853689.51327: checking for any_errors_fatal 29922 1726853689.51332: done checking for any_errors_fatal 29922 1726853689.51332: checking for max_fail_percentage 29922 1726853689.51335: done checking for max_fail_percentage 29922 1726853689.51336: checking to see if all hosts have failed and the running result is not ok 29922 1726853689.51336: done checking to see if all hosts have failed 29922 1726853689.51337: getting the remaining hosts for this loop 29922 1726853689.51338: done getting the remaining hosts for this loop 29922 1726853689.51341: getting the next task for host managed_node3 29922 1726853689.51347: done getting next task for host managed_node3 29922 1726853689.51350: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 29922 1726853689.51352: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29922 1726853689.51361: getting variables 29922 1726853689.51363: in VariableManager get_vars() 29922 1726853689.51433: Calling all_inventory to load vars for managed_node3 29922 1726853689.51436: Calling groups_inventory to load vars for managed_node3 29922 1726853689.51438: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853689.51448: Calling all_plugins_play to load vars for managed_node3 29922 1726853689.51450: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853689.51453: Calling groups_plugins_play to load vars for managed_node3 29922 1726853689.52555: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853689.54135: done with get_vars() 29922 1726853689.54167: done getting variables 29922 1726853689.54237: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 13:34:49 -0400 (0:00:00.598) 0:00:38.472 ****** 29922 1726853689.54272: entering _queue_task() for managed_node3/service 29922 1726853689.54642: worker is 1 (out of 1 available) 29922 1726853689.54656: exiting _queue_task() for managed_node3/service 29922 1726853689.54669: done queuing things up, now waiting for results queue to drain 29922 1726853689.54873: waiting for pending results... 
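
The module arguments echoed above for the "Enable and start NetworkManager" task (name=NetworkManager, state=started, enabled=True, run through ansible.legacy.systemd with no_log) map onto an ordinary systemd service task. A minimal, hypothetical stand-alone equivalent, not the role's literal task file, would be:

- name: Enable and start NetworkManager
  ansible.builtin.systemd:          # resolves to ansible.legacy.systemd in the trace above
    name: NetworkManager
    state: started
    enabled: true
  no_log: true                      # produces the "censored" result shown in the task output above
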
29922 1726853689.55090: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 29922 1726853689.55101: in run() - task 02083763-bbaf-51d4-513b-00000000009e 29922 1726853689.55104: variable 'ansible_search_path' from source: unknown 29922 1726853689.55107: variable 'ansible_search_path' from source: unknown 29922 1726853689.55148: calling self._execute() 29922 1726853689.55260: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853689.55319: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853689.55323: variable 'omit' from source: magic vars 29922 1726853689.55695: variable 'ansible_distribution_major_version' from source: facts 29922 1726853689.55714: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853689.55845: variable 'network_provider' from source: set_fact 29922 1726853689.55863: Evaluated conditional (network_provider == "nm"): True 29922 1726853689.55960: variable '__network_wpa_supplicant_required' from source: role '' defaults 29922 1726853689.56079: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 29922 1726853689.56249: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 29922 1726853689.57817: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 29922 1726853689.57873: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 29922 1726853689.57900: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 29922 1726853689.57924: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 29922 1726853689.57949: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 29922 1726853689.58020: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853689.58047: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29922 1726853689.58064: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853689.58092: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853689.58103: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853689.58134: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853689.58155: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 29922 1726853689.58174: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853689.58198: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853689.58209: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853689.58236: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29922 1726853689.58252: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29922 1726853689.58276: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853689.58300: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853689.58310: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853689.58407: variable 'network_connections' from source: play vars 29922 1726853689.58417: variable 'profile' from source: play vars 29922 1726853689.58469: variable 'profile' from source: play vars 29922 1726853689.58474: variable 'interface' from source: set_fact 29922 1726853689.58519: variable 'interface' from source: set_fact 29922 1726853689.58569: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 29922 1726853689.58679: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 29922 1726853689.58706: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 29922 1726853689.58729: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 29922 1726853689.58750: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 29922 1726853689.58784: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 29922 1726853689.58799: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 29922 1726853689.58819: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853689.58836: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 29922 1726853689.58875: variable '__network_wireless_connections_defined' from source: role '' defaults 29922 1726853689.59030: variable 'network_connections' from source: play vars 29922 1726853689.59039: variable 'profile' from source: play vars 29922 1726853689.59084: variable 'profile' from source: play vars 29922 1726853689.59087: variable 'interface' from source: set_fact 29922 1726853689.59130: variable 'interface' from source: set_fact 29922 1726853689.59157: Evaluated conditional (__network_wpa_supplicant_required): False 29922 1726853689.59161: when evaluation is False, skipping this task 29922 1726853689.59163: _execute() done 29922 1726853689.59177: dumping result to json 29922 1726853689.59179: done dumping result, returning 29922 1726853689.59182: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [02083763-bbaf-51d4-513b-00000000009e] 29922 1726853689.59184: sending task result for task 02083763-bbaf-51d4-513b-00000000009e 29922 1726853689.59266: done sending task result for task 02083763-bbaf-51d4-513b-00000000009e 29922 1726853689.59269: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 29922 1726853689.59317: no more pending results, returning what we have 29922 1726853689.59320: results queue empty 29922 1726853689.59321: checking for any_errors_fatal 29922 1726853689.59341: done checking for any_errors_fatal 29922 1726853689.59342: checking for max_fail_percentage 29922 1726853689.59344: done checking for max_fail_percentage 29922 1726853689.59345: checking to see if all hosts have failed and the running result is not ok 29922 1726853689.59345: done checking to see if all hosts have failed 29922 1726853689.59346: getting the remaining hosts for this loop 29922 1726853689.59347: done getting the remaining hosts for this loop 29922 1726853689.59351: getting the next task for host managed_node3 29922 1726853689.59358: done getting next task for host managed_node3 29922 1726853689.59362: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 29922 1726853689.59364: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853689.59381: getting variables 29922 1726853689.59383: in VariableManager get_vars() 29922 1726853689.59417: Calling all_inventory to load vars for managed_node3 29922 1726853689.59420: Calling groups_inventory to load vars for managed_node3 29922 1726853689.59422: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853689.59432: Calling all_plugins_play to load vars for managed_node3 29922 1726853689.59435: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853689.59437: Calling groups_plugins_play to load vars for managed_node3 29922 1726853689.60274: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853689.61275: done with get_vars() 29922 1726853689.61291: done getting variables 29922 1726853689.61336: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 13:34:49 -0400 (0:00:00.070) 0:00:38.543 ****** 29922 1726853689.61360: entering _queue_task() for managed_node3/service 29922 1726853689.61618: worker is 1 (out of 1 available) 29922 1726853689.61632: exiting _queue_task() for managed_node3/service 29922 1726853689.61644: done queuing things up, now waiting for results queue to drain 29922 1726853689.61645: waiting for pending results... 29922 1726853689.61822: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 29922 1726853689.61892: in run() - task 02083763-bbaf-51d4-513b-00000000009f 29922 1726853689.61903: variable 'ansible_search_path' from source: unknown 29922 1726853689.61905: variable 'ansible_search_path' from source: unknown 29922 1726853689.61934: calling self._execute() 29922 1726853689.62014: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853689.62020: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853689.62029: variable 'omit' from source: magic vars 29922 1726853689.62313: variable 'ansible_distribution_major_version' from source: facts 29922 1726853689.62319: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853689.62399: variable 'network_provider' from source: set_fact 29922 1726853689.62403: Evaluated conditional (network_provider == "initscripts"): False 29922 1726853689.62408: when evaluation is False, skipping this task 29922 1726853689.62413: _execute() done 29922 1726853689.62415: dumping result to json 29922 1726853689.62418: done dumping result, returning 29922 1726853689.62421: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [02083763-bbaf-51d4-513b-00000000009f] 29922 1726853689.62431: sending task result for task 02083763-bbaf-51d4-513b-00000000009f 29922 1726853689.62510: done sending task result for task 02083763-bbaf-51d4-513b-00000000009f 29922 1726853689.62513: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 29922 
1726853689.62575: no more pending results, returning what we have 29922 1726853689.62579: results queue empty 29922 1726853689.62580: checking for any_errors_fatal 29922 1726853689.62588: done checking for any_errors_fatal 29922 1726853689.62588: checking for max_fail_percentage 29922 1726853689.62590: done checking for max_fail_percentage 29922 1726853689.62591: checking to see if all hosts have failed and the running result is not ok 29922 1726853689.62592: done checking to see if all hosts have failed 29922 1726853689.62592: getting the remaining hosts for this loop 29922 1726853689.62594: done getting the remaining hosts for this loop 29922 1726853689.62597: getting the next task for host managed_node3 29922 1726853689.62601: done getting next task for host managed_node3 29922 1726853689.62605: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 29922 1726853689.62608: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29922 1726853689.62623: getting variables 29922 1726853689.62625: in VariableManager get_vars() 29922 1726853689.62659: Calling all_inventory to load vars for managed_node3 29922 1726853689.62661: Calling groups_inventory to load vars for managed_node3 29922 1726853689.62663: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853689.62674: Calling all_plugins_play to load vars for managed_node3 29922 1726853689.62676: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853689.62679: Calling groups_plugins_play to load vars for managed_node3 29922 1726853689.63474: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853689.64345: done with get_vars() 29922 1726853689.64360: done getting variables 29922 1726853689.64426: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 13:34:49 -0400 (0:00:00.030) 0:00:38.574 ****** 29922 1726853689.64453: entering _queue_task() for managed_node3/copy 29922 1726853689.64750: worker is 1 (out of 1 available) 29922 1726853689.64763: exiting _queue_task() for managed_node3/copy 29922 1726853689.64978: done queuing things up, now waiting for results queue to drain 29922 1726853689.64980: waiting for pending results... 
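
Both "Enable and start wpa_supplicant" and "Enable network service" are skipped in the trace above because their when: conditions evaluate to False with the nm provider in effect. A hypothetical sketch of that skip pattern (the service name "network" is assumed here for illustration; it is not shown in the log):

- name: Enable network service
  ansible.builtin.service:
    name: network                              # assumed initscripts service name, illustration only
    state: started
    enabled: true
  when: network_provider == "initscripts"      # False here, so the task reports "skipping"
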
29922 1726853689.65108: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 29922 1726853689.65163: in run() - task 02083763-bbaf-51d4-513b-0000000000a0 29922 1726853689.65186: variable 'ansible_search_path' from source: unknown 29922 1726853689.65194: variable 'ansible_search_path' from source: unknown 29922 1726853689.65239: calling self._execute() 29922 1726853689.65346: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853689.65359: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853689.65378: variable 'omit' from source: magic vars 29922 1726853689.65684: variable 'ansible_distribution_major_version' from source: facts 29922 1726853689.65693: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853689.65770: variable 'network_provider' from source: set_fact 29922 1726853689.65776: Evaluated conditional (network_provider == "initscripts"): False 29922 1726853689.65779: when evaluation is False, skipping this task 29922 1726853689.65782: _execute() done 29922 1726853689.65790: dumping result to json 29922 1726853689.65793: done dumping result, returning 29922 1726853689.65797: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [02083763-bbaf-51d4-513b-0000000000a0] 29922 1726853689.65800: sending task result for task 02083763-bbaf-51d4-513b-0000000000a0 29922 1726853689.65888: done sending task result for task 02083763-bbaf-51d4-513b-0000000000a0 29922 1726853689.65892: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 29922 1726853689.65936: no more pending results, returning what we have 29922 1726853689.65940: results queue empty 29922 1726853689.65941: checking for any_errors_fatal 29922 1726853689.65949: done checking for any_errors_fatal 29922 1726853689.65950: checking for max_fail_percentage 29922 1726853689.65951: done checking for max_fail_percentage 29922 1726853689.65952: checking to see if all hosts have failed and the running result is not ok 29922 1726853689.65953: done checking to see if all hosts have failed 29922 1726853689.65953: getting the remaining hosts for this loop 29922 1726853689.65957: done getting the remaining hosts for this loop 29922 1726853689.65960: getting the next task for host managed_node3 29922 1726853689.65966: done getting next task for host managed_node3 29922 1726853689.65969: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 29922 1726853689.65973: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853689.65987: getting variables 29922 1726853689.65988: in VariableManager get_vars() 29922 1726853689.66020: Calling all_inventory to load vars for managed_node3 29922 1726853689.66022: Calling groups_inventory to load vars for managed_node3 29922 1726853689.66024: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853689.66032: Calling all_plugins_play to load vars for managed_node3 29922 1726853689.66035: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853689.66037: Calling groups_plugins_play to load vars for managed_node3 29922 1726853689.66968: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853689.68361: done with get_vars() 29922 1726853689.68392: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 13:34:49 -0400 (0:00:00.040) 0:00:38.614 ****** 29922 1726853689.68469: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 29922 1726853689.68826: worker is 1 (out of 1 available) 29922 1726853689.68840: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 29922 1726853689.68853: done queuing things up, now waiting for results queue to drain 29922 1726853689.68857: waiting for pending results... 29922 1726853689.69198: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 29922 1726853689.69477: in run() - task 02083763-bbaf-51d4-513b-0000000000a1 29922 1726853689.69480: variable 'ansible_search_path' from source: unknown 29922 1726853689.69483: variable 'ansible_search_path' from source: unknown 29922 1726853689.69486: calling self._execute() 29922 1726853689.69488: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853689.69491: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853689.69494: variable 'omit' from source: magic vars 29922 1726853689.69893: variable 'ansible_distribution_major_version' from source: facts 29922 1726853689.69910: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853689.69923: variable 'omit' from source: magic vars 29922 1726853689.69973: variable 'omit' from source: magic vars 29922 1726853689.70141: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 29922 1726853689.71758: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 29922 1726853689.71808: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 29922 1726853689.71836: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 29922 1726853689.71864: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 29922 1726853689.71889: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 29922 1726853689.71942: variable 'network_provider' from source: set_fact 29922 1726853689.72041: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 29922 1726853689.72076: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29922 1726853689.72094: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29922 1726853689.72123: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29922 1726853689.72134: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29922 1726853689.72189: variable 'omit' from source: magic vars 29922 1726853689.72268: variable 'omit' from source: magic vars 29922 1726853689.72357: variable 'network_connections' from source: play vars 29922 1726853689.72475: variable 'profile' from source: play vars 29922 1726853689.72478: variable 'profile' from source: play vars 29922 1726853689.72481: variable 'interface' from source: set_fact 29922 1726853689.72506: variable 'interface' from source: set_fact 29922 1726853689.72651: variable 'omit' from source: magic vars 29922 1726853689.72668: variable '__lsr_ansible_managed' from source: task vars 29922 1726853689.72732: variable '__lsr_ansible_managed' from source: task vars 29922 1726853689.73001: Loaded config def from plugin (lookup/template) 29922 1726853689.73011: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 29922 1726853689.73042: File lookup term: get_ansible_managed.j2 29922 1726853689.73050: variable 'ansible_search_path' from source: unknown 29922 1726853689.73063: evaluation_path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 29922 1726853689.73083: search_path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 29922 1726853689.73106: variable 'ansible_search_path' from source: unknown 29922 1726853689.81813: variable 'ansible_managed' from source: unknown 29922 1726853689.81943: variable 'omit' from source: magic vars 29922 1726853689.81977: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29922 1726853689.82005: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29922 1726853689.82026: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29922 1726853689.82047: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853689.82276: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853689.82279: variable 'inventory_hostname' from source: host vars for 'managed_node3' 29922 1726853689.82281: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853689.82283: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853689.82285: Set connection var ansible_connection to ssh 29922 1726853689.82287: Set connection var ansible_module_compression to ZIP_DEFLATED 29922 1726853689.82288: Set connection var ansible_shell_executable to /bin/sh 29922 1726853689.82290: Set connection var ansible_pipelining to False 29922 1726853689.82292: Set connection var ansible_timeout to 10 29922 1726853689.82294: Set connection var ansible_shell_type to sh 29922 1726853689.82297: variable 'ansible_shell_executable' from source: unknown 29922 1726853689.82299: variable 'ansible_connection' from source: unknown 29922 1726853689.82301: variable 'ansible_module_compression' from source: unknown 29922 1726853689.82302: variable 'ansible_shell_type' from source: unknown 29922 1726853689.82304: variable 'ansible_shell_executable' from source: unknown 29922 1726853689.82306: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853689.82308: variable 'ansible_pipelining' from source: unknown 29922 1726853689.82310: variable 'ansible_timeout' from source: unknown 29922 1726853689.82312: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853689.82397: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 29922 1726853689.82423: variable 'omit' from source: magic vars 29922 1726853689.82435: starting attempt loop 29922 1726853689.82442: running the handler 29922 1726853689.82455: _low_level_execute_command(): starting 29922 1726853689.82465: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 29922 1726853689.82991: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853689.83007: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853689.83019: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853689.83062: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853689.83086: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853689.83145: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853689.84852: stdout chunk (state=3): >>>/root <<< 29922 1726853689.84947: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853689.84987: stderr chunk (state=3): >>><<< 29922 1726853689.84989: stdout chunk (state=3): >>><<< 29922 1726853689.85007: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853689.85078: _low_level_execute_command(): starting 29922 1726853689.85082: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853689.8501225-31717-167304676324090 `" && echo ansible-tmp-1726853689.8501225-31717-167304676324090="` echo /root/.ansible/tmp/ansible-tmp-1726853689.8501225-31717-167304676324090 `" ) && sleep 0' 29922 1726853689.85476: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853689.85479: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 29922 1726853689.85482: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853689.85484: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 29922 1726853689.85486: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853689.85534: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853689.85540: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853689.85543: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853689.85602: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853689.87566: stdout chunk (state=3): >>>ansible-tmp-1726853689.8501225-31717-167304676324090=/root/.ansible/tmp/ansible-tmp-1726853689.8501225-31717-167304676324090 <<< 29922 1726853689.87676: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853689.87707: stderr chunk (state=3): >>><<< 29922 1726853689.87710: stdout chunk (state=3): >>><<< 29922 1726853689.87728: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853689.8501225-31717-167304676324090=/root/.ansible/tmp/ansible-tmp-1726853689.8501225-31717-167304676324090 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853689.87765: variable 'ansible_module_compression' from source: unknown 29922 1726853689.87798: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-29922avbm1zsd/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 29922 1726853689.87836: variable 'ansible_facts' from source: unknown 29922 1726853689.87924: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853689.8501225-31717-167304676324090/AnsiballZ_network_connections.py 29922 1726853689.88026: Sending initial data 29922 1726853689.88029: Sent initial data (168 bytes) 29922 1726853689.88453: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853689.88460: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853689.88493: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853689.88497: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853689.88499: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853689.88553: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853689.88558: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853689.88560: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853689.88620: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853689.90243: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 29922 1726853689.90250: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 29922 1726853689.90300: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 29922 1726853689.90358: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-29922avbm1zsd/tmp3qtuhys9 /root/.ansible/tmp/ansible-tmp-1726853689.8501225-31717-167304676324090/AnsiballZ_network_connections.py <<< 29922 1726853689.90365: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853689.8501225-31717-167304676324090/AnsiballZ_network_connections.py" <<< 29922 1726853689.90417: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-29922avbm1zsd/tmp3qtuhys9" to remote "/root/.ansible/tmp/ansible-tmp-1726853689.8501225-31717-167304676324090/AnsiballZ_network_connections.py" <<< 29922 1726853689.90420: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853689.8501225-31717-167304676324090/AnsiballZ_network_connections.py" <<< 29922 1726853689.91197: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853689.91238: stderr chunk (state=3): >>><<< 29922 1726853689.91241: stdout chunk (state=3): >>><<< 29922 1726853689.91269: done transferring module to remote 29922 1726853689.91279: _low_level_execute_command(): starting 29922 1726853689.91284: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853689.8501225-31717-167304676324090/ /root/.ansible/tmp/ansible-tmp-1726853689.8501225-31717-167304676324090/AnsiballZ_network_connections.py && sleep 0' 29922 1726853689.91736: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853689.91739: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853689.91742: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853689.91744: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853689.91746: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853689.91801: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853689.91806: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853689.91809: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853689.91862: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853689.93697: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853689.93724: stderr chunk (state=3): >>><<< 29922 1726853689.93727: stdout chunk (state=3): >>><<< 29922 1726853689.93744: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853689.93747: _low_level_execute_command(): starting 29922 1726853689.93752: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853689.8501225-31717-167304676324090/AnsiballZ_network_connections.py && sleep 0' 29922 1726853689.94176: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853689.94209: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 29922 1726853689.94212: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<< 29922 1726853689.94215: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853689.94221: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853689.94267: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853689.94275: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853689.94338: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853690.22287: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_4fbtv10l/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_4fbtv10l/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in 
fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on ethtest0/0bb5d45f-2ba9-4c52-9232-d0cdb613594b: error=unknown <<< 29922 1726853690.22447: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 29922 1726853690.24347: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 29922 1726853690.24380: stderr chunk (state=3): >>><<< 29922 1726853690.24383: stdout chunk (state=3): >>><<< 29922 1726853690.24401: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_4fbtv10l/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_4fbtv10l/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on ethtest0/0bb5d45f-2ba9-4c52-9232-d0cdb613594b: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 29922 1726853690.24427: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'ethtest0', 'persistent_state': 'absent'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853689.8501225-31717-167304676324090/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 29922 1726853690.24434: _low_level_execute_command(): starting 29922 1726853690.24439: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853689.8501225-31717-167304676324090/ > /dev/null 2>&1 && sleep 0' 29922 1726853690.24867: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853690.24898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 29922 1726853690.24901: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853690.24904: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853690.24906: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853690.24908: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853690.24962: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853690.24967: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853690.24968: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853690.25026: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853690.26904: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853690.26930: stderr chunk (state=3): >>><<< 29922 1726853690.26933: stdout chunk (state=3): >>><<< 29922 1726853690.26948: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853690.26959: handler run complete 29922 1726853690.26980: attempt loop complete, returning result 29922 1726853690.26983: _execute() done 29922 1726853690.26985: dumping result to json 29922 1726853690.26990: done dumping result, returning 29922 1726853690.26997: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [02083763-bbaf-51d4-513b-0000000000a1] 29922 1726853690.26999: sending task result for task 02083763-bbaf-51d4-513b-0000000000a1 29922 1726853690.27093: done sending task result for task 02083763-bbaf-51d4-513b-0000000000a1 29922 1726853690.27096: WORKER PROCESS EXITING changed: [managed_node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "ethtest0", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 29922 1726853690.27178: no more pending results, returning what we have 29922 1726853690.27181: results queue empty 29922 1726853690.27182: checking for any_errors_fatal 29922 1726853690.27187: done checking for any_errors_fatal 29922 1726853690.27187: checking for max_fail_percentage 29922 1726853690.27189: done checking for max_fail_percentage 29922 1726853690.27190: checking to see if all hosts have failed and the running result is not ok 29922 1726853690.27190: done checking to see if all hosts have failed 29922 1726853690.27191: getting the remaining hosts for this loop 29922 1726853690.27193: done getting the remaining hosts for this loop 29922 1726853690.27196: getting the next task for host managed_node3 29922 1726853690.27201: done getting next task for host managed_node3 29922 1726853690.27204: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 29922 1726853690.27207: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853690.27216: getting variables 29922 1726853690.27217: in VariableManager get_vars() 29922 1726853690.27250: Calling all_inventory to load vars for managed_node3 29922 1726853690.27253: Calling groups_inventory to load vars for managed_node3 29922 1726853690.27255: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853690.27263: Calling all_plugins_play to load vars for managed_node3 29922 1726853690.27265: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853690.27268: Calling groups_plugins_play to load vars for managed_node3 29922 1726853690.28140: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853690.29003: done with get_vars() 29922 1726853690.29020: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 13:34:50 -0400 (0:00:00.606) 0:00:39.220 ****** 29922 1726853690.29081: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 29922 1726853690.29321: worker is 1 (out of 1 available) 29922 1726853690.29335: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state 29922 1726853690.29348: done queuing things up, now waiting for results queue to drain 29922 1726853690.29349: waiting for pending results... 29922 1726853690.29525: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 29922 1726853690.29605: in run() - task 02083763-bbaf-51d4-513b-0000000000a2 29922 1726853690.29616: variable 'ansible_search_path' from source: unknown 29922 1726853690.29620: variable 'ansible_search_path' from source: unknown 29922 1726853690.29649: calling self._execute() 29922 1726853690.29730: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853690.29737: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853690.29745: variable 'omit' from source: magic vars 29922 1726853690.30020: variable 'ansible_distribution_major_version' from source: facts 29922 1726853690.30030: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853690.30115: variable 'network_state' from source: role '' defaults 29922 1726853690.30124: Evaluated conditional (network_state != {}): False 29922 1726853690.30127: when evaluation is False, skipping this task 29922 1726853690.30129: _execute() done 29922 1726853690.30132: dumping result to json 29922 1726853690.30134: done dumping result, returning 29922 1726853690.30139: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [02083763-bbaf-51d4-513b-0000000000a2] 29922 1726853690.30144: sending task result for task 02083763-bbaf-51d4-513b-0000000000a2 29922 1726853690.30227: done sending task result for task 02083763-bbaf-51d4-513b-0000000000a2 29922 1726853690.30230: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 29922 1726853690.30280: no more pending results, returning what we have 29922 1726853690.30285: results queue empty 29922 1726853690.30285: checking for any_errors_fatal 29922 1726853690.30293: done checking for any_errors_fatal 29922 1726853690.30294: checking for max_fail_percentage 29922 
1726853690.30295: done checking for max_fail_percentage 29922 1726853690.30296: checking to see if all hosts have failed and the running result is not ok 29922 1726853690.30297: done checking to see if all hosts have failed 29922 1726853690.30298: getting the remaining hosts for this loop 29922 1726853690.30299: done getting the remaining hosts for this loop 29922 1726853690.30303: getting the next task for host managed_node3 29922 1726853690.30308: done getting next task for host managed_node3 29922 1726853690.30312: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 29922 1726853690.30315: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29922 1726853690.30329: getting variables 29922 1726853690.30331: in VariableManager get_vars() 29922 1726853690.30364: Calling all_inventory to load vars for managed_node3 29922 1726853690.30366: Calling groups_inventory to load vars for managed_node3 29922 1726853690.30368: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853690.30379: Calling all_plugins_play to load vars for managed_node3 29922 1726853690.30381: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853690.30384: Calling groups_plugins_play to load vars for managed_node3 29922 1726853690.34466: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853690.35372: done with get_vars() 29922 1726853690.35389: done getting variables 29922 1726853690.35423: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 13:34:50 -0400 (0:00:00.063) 0:00:39.283 ****** 29922 1726853690.35442: entering _queue_task() for managed_node3/debug 29922 1726853690.35701: worker is 1 (out of 1 available) 29922 1726853690.35715: exiting _queue_task() for managed_node3/debug 29922 1726853690.35727: done queuing things up, now waiting for results queue to drain 29922 1726853690.35729: waiting for pending results... 
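The module_args echoed in the result above (provider "nm", a single connection "ethtest0" with persistent_state "absent") are what the network role hands to fedora.linux_system_roles.network_connections. A minimal play that would drive this invocation, sketched against the role's documented network_connections variable rather than copied from the actual test playbook, looks like:

    - hosts: managed_node3
      roles:
        - role: fedora.linux_system_roles.network
          vars:
            # Maps to the "connections" list visible in module_args above.
            network_connections:
              - name: ethtest0
                persistent_state: absent
            # network_state is left at its default of {}, which is why the
            # "Configure networking state" task is skipped
            # (false_condition: "network_state != {}").

Note that even though the module printed an LsrNetworkNmError traceback on stdout ("Connection volatilize aborted on ethtest0/...: error=unknown"), it still reported changed: true and failed: false, so the run continues.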
29922 1726853690.35911: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 29922 1726853690.35991: in run() - task 02083763-bbaf-51d4-513b-0000000000a3 29922 1726853690.36003: variable 'ansible_search_path' from source: unknown 29922 1726853690.36007: variable 'ansible_search_path' from source: unknown 29922 1726853690.36035: calling self._execute() 29922 1726853690.36113: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853690.36117: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853690.36126: variable 'omit' from source: magic vars 29922 1726853690.36405: variable 'ansible_distribution_major_version' from source: facts 29922 1726853690.36416: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853690.36421: variable 'omit' from source: magic vars 29922 1726853690.36445: variable 'omit' from source: magic vars 29922 1726853690.36474: variable 'omit' from source: magic vars 29922 1726853690.36506: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29922 1726853690.36531: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29922 1726853690.36546: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29922 1726853690.36562: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853690.36574: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853690.36599: variable 'inventory_hostname' from source: host vars for 'managed_node3' 29922 1726853690.36602: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853690.36605: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853690.36672: Set connection var ansible_connection to ssh 29922 1726853690.36679: Set connection var ansible_module_compression to ZIP_DEFLATED 29922 1726853690.36686: Set connection var ansible_shell_executable to /bin/sh 29922 1726853690.36692: Set connection var ansible_pipelining to False 29922 1726853690.36697: Set connection var ansible_timeout to 10 29922 1726853690.36700: Set connection var ansible_shell_type to sh 29922 1726853690.36722: variable 'ansible_shell_executable' from source: unknown 29922 1726853690.36725: variable 'ansible_connection' from source: unknown 29922 1726853690.36728: variable 'ansible_module_compression' from source: unknown 29922 1726853690.36730: variable 'ansible_shell_type' from source: unknown 29922 1726853690.36733: variable 'ansible_shell_executable' from source: unknown 29922 1726853690.36735: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853690.36737: variable 'ansible_pipelining' from source: unknown 29922 1726853690.36739: variable 'ansible_timeout' from source: unknown 29922 1726853690.36741: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853690.36843: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 29922 
1726853690.36853: variable 'omit' from source: magic vars 29922 1726853690.36860: starting attempt loop 29922 1726853690.36863: running the handler 29922 1726853690.36954: variable '__network_connections_result' from source: set_fact 29922 1726853690.36998: handler run complete 29922 1726853690.37010: attempt loop complete, returning result 29922 1726853690.37014: _execute() done 29922 1726853690.37016: dumping result to json 29922 1726853690.37019: done dumping result, returning 29922 1726853690.37027: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [02083763-bbaf-51d4-513b-0000000000a3] 29922 1726853690.37029: sending task result for task 02083763-bbaf-51d4-513b-0000000000a3 29922 1726853690.37112: done sending task result for task 02083763-bbaf-51d4-513b-0000000000a3 29922 1726853690.37115: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result.stderr_lines": [ "" ] } 29922 1726853690.37206: no more pending results, returning what we have 29922 1726853690.37209: results queue empty 29922 1726853690.37210: checking for any_errors_fatal 29922 1726853690.37218: done checking for any_errors_fatal 29922 1726853690.37219: checking for max_fail_percentage 29922 1726853690.37220: done checking for max_fail_percentage 29922 1726853690.37221: checking to see if all hosts have failed and the running result is not ok 29922 1726853690.37222: done checking to see if all hosts have failed 29922 1726853690.37223: getting the remaining hosts for this loop 29922 1726853690.37224: done getting the remaining hosts for this loop 29922 1726853690.37227: getting the next task for host managed_node3 29922 1726853690.37231: done getting next task for host managed_node3 29922 1726853690.37234: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 29922 1726853690.37236: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853690.37245: getting variables 29922 1726853690.37246: in VariableManager get_vars() 29922 1726853690.37278: Calling all_inventory to load vars for managed_node3 29922 1726853690.37280: Calling groups_inventory to load vars for managed_node3 29922 1726853690.37282: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853690.37291: Calling all_plugins_play to load vars for managed_node3 29922 1726853690.37293: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853690.37296: Calling groups_plugins_play to load vars for managed_node3 29922 1726853690.38061: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853690.39517: done with get_vars() 29922 1726853690.39534: done getting variables 29922 1726853690.39582: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 13:34:50 -0400 (0:00:00.041) 0:00:39.325 ****** 29922 1726853690.39604: entering _queue_task() for managed_node3/debug 29922 1726853690.39836: worker is 1 (out of 1 available) 29922 1726853690.39849: exiting _queue_task() for managed_node3/debug 29922 1726853690.39859: done queuing things up, now waiting for results queue to drain 29922 1726853690.39861: waiting for pending results... 29922 1726853690.40036: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 29922 1726853690.40107: in run() - task 02083763-bbaf-51d4-513b-0000000000a4 29922 1726853690.40119: variable 'ansible_search_path' from source: unknown 29922 1726853690.40122: variable 'ansible_search_path' from source: unknown 29922 1726853690.40149: calling self._execute() 29922 1726853690.40231: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853690.40235: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853690.40245: variable 'omit' from source: magic vars 29922 1726853690.40527: variable 'ansible_distribution_major_version' from source: facts 29922 1726853690.40534: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853690.40540: variable 'omit' from source: magic vars 29922 1726853690.40570: variable 'omit' from source: magic vars 29922 1726853690.40596: variable 'omit' from source: magic vars 29922 1726853690.40628: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29922 1726853690.40655: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29922 1726853690.40674: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29922 1726853690.40687: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853690.40697: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853690.40721: variable 
'inventory_hostname' from source: host vars for 'managed_node3' 29922 1726853690.40724: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853690.40727: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853690.40796: Set connection var ansible_connection to ssh 29922 1726853690.40803: Set connection var ansible_module_compression to ZIP_DEFLATED 29922 1726853690.40810: Set connection var ansible_shell_executable to /bin/sh 29922 1726853690.40816: Set connection var ansible_pipelining to False 29922 1726853690.40821: Set connection var ansible_timeout to 10 29922 1726853690.40824: Set connection var ansible_shell_type to sh 29922 1726853690.40841: variable 'ansible_shell_executable' from source: unknown 29922 1726853690.40844: variable 'ansible_connection' from source: unknown 29922 1726853690.40849: variable 'ansible_module_compression' from source: unknown 29922 1726853690.40851: variable 'ansible_shell_type' from source: unknown 29922 1726853690.40854: variable 'ansible_shell_executable' from source: unknown 29922 1726853690.40856: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853690.40858: variable 'ansible_pipelining' from source: unknown 29922 1726853690.40860: variable 'ansible_timeout' from source: unknown 29922 1726853690.40868: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853690.40962: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 29922 1726853690.40974: variable 'omit' from source: magic vars 29922 1726853690.40977: starting attempt loop 29922 1726853690.40980: running the handler 29922 1726853690.41017: variable '__network_connections_result' from source: set_fact 29922 1726853690.41073: variable '__network_connections_result' from source: set_fact 29922 1726853690.41143: handler run complete 29922 1726853690.41160: attempt loop complete, returning result 29922 1726853690.41163: _execute() done 29922 1726853690.41166: dumping result to json 29922 1726853690.41168: done dumping result, returning 29922 1726853690.41179: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [02083763-bbaf-51d4-513b-0000000000a4] 29922 1726853690.41181: sending task result for task 02083763-bbaf-51d4-513b-0000000000a4 29922 1726853690.41262: done sending task result for task 02083763-bbaf-51d4-513b-0000000000a4 29922 1726853690.41265: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "ethtest0", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 29922 1726853690.41377: no more pending results, returning what we have 29922 1726853690.41380: results queue empty 29922 1726853690.41381: checking for any_errors_fatal 29922 1726853690.41385: done checking for any_errors_fatal 29922 1726853690.41386: checking for max_fail_percentage 29922 1726853690.41387: done checking for max_fail_percentage 29922 1726853690.41388: checking to see 
if all hosts have failed and the running result is not ok 29922 1726853690.41388: done checking to see if all hosts have failed 29922 1726853690.41389: getting the remaining hosts for this loop 29922 1726853690.41390: done getting the remaining hosts for this loop 29922 1726853690.41393: getting the next task for host managed_node3 29922 1726853690.41397: done getting next task for host managed_node3 29922 1726853690.41400: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 29922 1726853690.41402: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29922 1726853690.41411: getting variables 29922 1726853690.41412: in VariableManager get_vars() 29922 1726853690.41440: Calling all_inventory to load vars for managed_node3 29922 1726853690.41443: Calling groups_inventory to load vars for managed_node3 29922 1726853690.41444: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853690.41452: Calling all_plugins_play to load vars for managed_node3 29922 1726853690.41454: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853690.41457: Calling groups_plugins_play to load vars for managed_node3 29922 1726853690.42580: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853690.44098: done with get_vars() 29922 1726853690.44123: done getting variables 29922 1726853690.44183: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 13:34:50 -0400 (0:00:00.046) 0:00:39.371 ****** 29922 1726853690.44217: entering _queue_task() for managed_node3/debug 29922 1726853690.44530: worker is 1 (out of 1 available) 29922 1726853690.44542: exiting _queue_task() for managed_node3/debug 29922 1726853690.44554: done queuing things up, now waiting for results queue to drain 29922 1726853690.44555: waiting for pending results... 
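The two "Show ... messages for the network_connections" tasks that just ran are plain debug tasks over the __network_connections_result fact set earlier (the log shows it coming from set_fact); their task paths point at roles/network/tasks/main.yml:177 and :181. A hedged reconstruction of those tasks (the exact YAML is not shown in this log):

    - name: Show stderr messages for the network_connections
      debug:
        var: __network_connections_result.stderr_lines

    - name: Show debug messages for the network_connections
      debug:
        var: __network_connections_result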
29922 1726853690.44989: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 29922 1726853690.44994: in run() - task 02083763-bbaf-51d4-513b-0000000000a5 29922 1726853690.44997: variable 'ansible_search_path' from source: unknown 29922 1726853690.45000: variable 'ansible_search_path' from source: unknown 29922 1726853690.45007: calling self._execute() 29922 1726853690.45117: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853690.45129: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853690.45144: variable 'omit' from source: magic vars 29922 1726853690.45516: variable 'ansible_distribution_major_version' from source: facts 29922 1726853690.45533: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853690.45656: variable 'network_state' from source: role '' defaults 29922 1726853690.45675: Evaluated conditional (network_state != {}): False 29922 1726853690.45682: when evaluation is False, skipping this task 29922 1726853690.45689: _execute() done 29922 1726853690.45695: dumping result to json 29922 1726853690.45702: done dumping result, returning 29922 1726853690.45714: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [02083763-bbaf-51d4-513b-0000000000a5] 29922 1726853690.45722: sending task result for task 02083763-bbaf-51d4-513b-0000000000a5 skipping: [managed_node3] => { "false_condition": "network_state != {}" } 29922 1726853690.45920: no more pending results, returning what we have 29922 1726853690.45924: results queue empty 29922 1726853690.45925: checking for any_errors_fatal 29922 1726853690.45936: done checking for any_errors_fatal 29922 1726853690.45937: checking for max_fail_percentage 29922 1726853690.45939: done checking for max_fail_percentage 29922 1726853690.45940: checking to see if all hosts have failed and the running result is not ok 29922 1726853690.45941: done checking to see if all hosts have failed 29922 1726853690.45941: getting the remaining hosts for this loop 29922 1726853690.45943: done getting the remaining hosts for this loop 29922 1726853690.45947: getting the next task for host managed_node3 29922 1726853690.45952: done getting next task for host managed_node3 29922 1726853690.45956: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 29922 1726853690.45959: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853690.45976: getting variables 29922 1726853690.45977: in VariableManager get_vars() 29922 1726853690.46015: Calling all_inventory to load vars for managed_node3 29922 1726853690.46017: Calling groups_inventory to load vars for managed_node3 29922 1726853690.46019: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853690.46031: Calling all_plugins_play to load vars for managed_node3 29922 1726853690.46035: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853690.46038: Calling groups_plugins_play to load vars for managed_node3 29922 1726853690.46685: done sending task result for task 02083763-bbaf-51d4-513b-0000000000a5 29922 1726853690.46689: WORKER PROCESS EXITING 29922 1726853690.47614: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853690.49332: done with get_vars() 29922 1726853690.49353: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 13:34:50 -0400 (0:00:00.052) 0:00:39.423 ****** 29922 1726853690.49440: entering _queue_task() for managed_node3/ping 29922 1726853690.49772: worker is 1 (out of 1 available) 29922 1726853690.49784: exiting _queue_task() for managed_node3/ping 29922 1726853690.49795: done queuing things up, now waiting for results queue to drain 29922 1726853690.49796: waiting for pending results... 29922 1726853690.50078: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 29922 1726853690.50192: in run() - task 02083763-bbaf-51d4-513b-0000000000a6 29922 1726853690.50216: variable 'ansible_search_path' from source: unknown 29922 1726853690.50224: variable 'ansible_search_path' from source: unknown 29922 1726853690.50264: calling self._execute() 29922 1726853690.50379: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853690.50391: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853690.50406: variable 'omit' from source: magic vars 29922 1726853690.50827: variable 'ansible_distribution_major_version' from source: facts 29922 1726853690.50844: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853690.50862: variable 'omit' from source: magic vars 29922 1726853690.50902: variable 'omit' from source: magic vars 29922 1726853690.50962: variable 'omit' from source: magic vars 29922 1726853690.50989: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29922 1726853690.51029: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29922 1726853690.51070: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29922 1726853690.51081: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853690.51098: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853690.51177: variable 'inventory_hostname' from source: host vars for 'managed_node3' 29922 1726853690.51181: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853690.51184: variable 'ansible_ssh_extra_args' from source: host vars 
for 'managed_node3' 29922 1726853690.51249: Set connection var ansible_connection to ssh 29922 1726853690.51263: Set connection var ansible_module_compression to ZIP_DEFLATED 29922 1726853690.51281: Set connection var ansible_shell_executable to /bin/sh 29922 1726853690.51297: Set connection var ansible_pipelining to False 29922 1726853690.51308: Set connection var ansible_timeout to 10 29922 1726853690.51316: Set connection var ansible_shell_type to sh 29922 1726853690.51376: variable 'ansible_shell_executable' from source: unknown 29922 1726853690.51379: variable 'ansible_connection' from source: unknown 29922 1726853690.51382: variable 'ansible_module_compression' from source: unknown 29922 1726853690.51384: variable 'ansible_shell_type' from source: unknown 29922 1726853690.51386: variable 'ansible_shell_executable' from source: unknown 29922 1726853690.51388: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853690.51393: variable 'ansible_pipelining' from source: unknown 29922 1726853690.51398: variable 'ansible_timeout' from source: unknown 29922 1726853690.51401: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853690.51594: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 29922 1726853690.51616: variable 'omit' from source: magic vars 29922 1726853690.51725: starting attempt loop 29922 1726853690.51729: running the handler 29922 1726853690.51731: _low_level_execute_command(): starting 29922 1726853690.51733: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 29922 1726853690.52394: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 29922 1726853690.52495: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853690.52551: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853690.52595: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853690.52731: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853690.54388: stdout chunk (state=3): >>>/root <<< 29922 1726853690.54520: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853690.54533: stdout chunk (state=3): >>><<< 29922 1726853690.54546: stderr chunk (state=3): >>><<< 29922 1726853690.54582: _low_level_execute_command() done: rc=0, stdout=/root , 
stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853690.54602: _low_level_execute_command(): starting 29922 1726853690.54613: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853690.545901-31744-235242521595155 `" && echo ansible-tmp-1726853690.545901-31744-235242521595155="` echo /root/.ansible/tmp/ansible-tmp-1726853690.545901-31744-235242521595155 `" ) && sleep 0' 29922 1726853690.55202: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853690.55221: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853690.55236: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853690.55256: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853690.55334: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 29922 1726853690.55348: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853690.55403: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853690.55421: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853690.55447: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853690.55558: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853690.58182: stdout chunk (state=3): >>>ansible-tmp-1726853690.545901-31744-235242521595155=/root/.ansible/tmp/ansible-tmp-1726853690.545901-31744-235242521595155 <<< 29922 1726853690.58187: stderr chunk (state=3): >>>debug2: Received exit status 
from master 0 <<< 29922 1726853690.58189: stdout chunk (state=3): >>><<< 29922 1726853690.58191: stderr chunk (state=3): >>><<< 29922 1726853690.58194: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853690.545901-31744-235242521595155=/root/.ansible/tmp/ansible-tmp-1726853690.545901-31744-235242521595155 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853690.58196: variable 'ansible_module_compression' from source: unknown 29922 1726853690.58198: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-29922avbm1zsd/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 29922 1726853690.58223: variable 'ansible_facts' from source: unknown 29922 1726853690.58408: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853690.545901-31744-235242521595155/AnsiballZ_ping.py 29922 1726853690.58831: Sending initial data 29922 1726853690.58840: Sent initial data (152 bytes) 29922 1726853690.59542: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853690.59557: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853690.59574: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853690.59594: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853690.59612: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 29922 1726853690.59624: stderr chunk (state=3): >>>debug2: match not found <<< 29922 1726853690.59638: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853690.59655: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 29922 1726853690.59668: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<< 29922 1726853690.59680: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 29922 1726853690.59691: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853690.59775: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853690.59977: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853690.60054: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853690.61715: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 29922 1726853690.61774: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 29922 1726853690.61877: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-29922avbm1zsd/tmp2x1ifucj /root/.ansible/tmp/ansible-tmp-1726853690.545901-31744-235242521595155/AnsiballZ_ping.py <<< 29922 1726853690.61889: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853690.545901-31744-235242521595155/AnsiballZ_ping.py" <<< 29922 1726853690.61980: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-29922avbm1zsd/tmp2x1ifucj" to remote "/root/.ansible/tmp/ansible-tmp-1726853690.545901-31744-235242521595155/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853690.545901-31744-235242521595155/AnsiballZ_ping.py" <<< 29922 1726853690.63749: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853690.63869: stderr chunk (state=3): >>><<< 29922 1726853690.63880: stdout chunk (state=3): >>><<< 29922 1726853690.63905: done transferring module to remote 29922 1726853690.63981: _low_level_execute_command(): starting 29922 1726853690.63991: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853690.545901-31744-235242521595155/ /root/.ansible/tmp/ansible-tmp-1726853690.545901-31744-235242521595155/AnsiballZ_ping.py && sleep 0' 29922 1726853690.65189: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853690.65290: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853690.65402: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853690.65537: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853690.65630: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853690.67733: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853690.67737: stdout chunk (state=3): >>><<< 29922 1726853690.67739: stderr chunk (state=3): >>><<< 29922 1726853690.67742: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853690.67744: _low_level_execute_command(): starting 29922 1726853690.67747: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853690.545901-31744-235242521595155/AnsiballZ_ping.py && sleep 0' 29922 1726853690.68996: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853690.69115: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853690.69211: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853690.69341: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master 
version 4 <<< 29922 1726853690.69448: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853690.84831: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 29922 1726853690.86152: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 29922 1726853690.86199: stderr chunk (state=3): >>><<< 29922 1726853690.86210: stdout chunk (state=3): >>><<< 29922 1726853690.86234: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
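The {"ping": "pong"} payload returned just above is the role's final "Re-test connectivity" step (task path roles/network/tasks/main.yml:192), run through the normal action plugin with the ping module. A sketch of the task (the role's actual YAML is not shown here):

    - name: Re-test connectivity
      ping:   # "data" defaults to "pong", matching the module_args echoed in the result

As the surrounding _low_level_execute_command() calls show, this module run follows the same pattern as every other one in the log: create a remote temp directory, sftp the AnsiballZ_ping.py payload across, chmod u+x it, execute it with /usr/bin/python3.12, and finally rm -f -r the temp directory.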
29922 1726853690.86274: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853690.545901-31744-235242521595155/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 29922 1726853690.86290: _low_level_execute_command(): starting 29922 1726853690.86300: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853690.545901-31744-235242521595155/ > /dev/null 2>&1 && sleep 0' 29922 1726853690.86919: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853690.86937: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853690.86953: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853690.86973: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853690.86991: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 29922 1726853690.87041: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853690.87103: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853690.87135: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853690.87178: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853690.87244: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853690.89251: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853690.89255: stdout chunk (state=3): >>><<< 29922 1726853690.89257: stderr chunk (state=3): >>><<< 29922 1726853690.89598: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853690.89608: handler run complete 29922 1726853690.89611: attempt loop complete, returning result 29922 1726853690.89613: _execute() done 29922 1726853690.89615: dumping result to json 29922 1726853690.89617: done dumping result, returning 29922 1726853690.89619: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [02083763-bbaf-51d4-513b-0000000000a6] 29922 1726853690.89622: sending task result for task 02083763-bbaf-51d4-513b-0000000000a6 ok: [managed_node3] => { "changed": false, "ping": "pong" } 29922 1726853690.89787: no more pending results, returning what we have 29922 1726853690.89790: results queue empty 29922 1726853690.89791: checking for any_errors_fatal 29922 1726853690.89800: done checking for any_errors_fatal 29922 1726853690.89801: checking for max_fail_percentage 29922 1726853690.89804: done checking for max_fail_percentage 29922 1726853690.89805: checking to see if all hosts have failed and the running result is not ok 29922 1726853690.89806: done checking to see if all hosts have failed 29922 1726853690.89806: getting the remaining hosts for this loop 29922 1726853690.89808: done getting the remaining hosts for this loop 29922 1726853690.89812: getting the next task for host managed_node3 29922 1726853690.89819: done getting next task for host managed_node3 29922 1726853690.89821: ^ task is: TASK: meta (role_complete) 29922 1726853690.89824: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853690.89836: getting variables 29922 1726853690.89838: in VariableManager get_vars() 29922 1726853690.90075: Calling all_inventory to load vars for managed_node3 29922 1726853690.90078: Calling groups_inventory to load vars for managed_node3 29922 1726853690.90082: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853690.90093: Calling all_plugins_play to load vars for managed_node3 29922 1726853690.90096: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853690.90099: Calling groups_plugins_play to load vars for managed_node3 29922 1726853690.90890: done sending task result for task 02083763-bbaf-51d4-513b-0000000000a6 29922 1726853690.90894: WORKER PROCESS EXITING 29922 1726853690.93269: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853690.95401: done with get_vars() 29922 1726853690.95434: done getting variables 29922 1726853690.95518: done queuing things up, now waiting for results queue to drain 29922 1726853690.95520: results queue empty 29922 1726853690.95521: checking for any_errors_fatal 29922 1726853690.95524: done checking for any_errors_fatal 29922 1726853690.95525: checking for max_fail_percentage 29922 1726853690.95526: done checking for max_fail_percentage 29922 1726853690.95527: checking to see if all hosts have failed and the running result is not ok 29922 1726853690.95527: done checking to see if all hosts have failed 29922 1726853690.95528: getting the remaining hosts for this loop 29922 1726853690.95529: done getting the remaining hosts for this loop 29922 1726853690.95532: getting the next task for host managed_node3 29922 1726853690.95536: done getting next task for host managed_node3 29922 1726853690.95537: ^ task is: TASK: meta (flush_handlers) 29922 1726853690.95539: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853690.95542: getting variables 29922 1726853690.95543: in VariableManager get_vars() 29922 1726853690.95559: Calling all_inventory to load vars for managed_node3 29922 1726853690.95561: Calling groups_inventory to load vars for managed_node3 29922 1726853690.95563: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853690.95568: Calling all_plugins_play to load vars for managed_node3 29922 1726853690.95572: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853690.95576: Calling groups_plugins_play to load vars for managed_node3 29922 1726853690.97007: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853690.98846: done with get_vars() 29922 1726853690.98883: done getting variables 29922 1726853690.98962: in VariableManager get_vars() 29922 1726853690.98979: Calling all_inventory to load vars for managed_node3 29922 1726853690.98981: Calling groups_inventory to load vars for managed_node3 29922 1726853690.99001: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853690.99008: Calling all_plugins_play to load vars for managed_node3 29922 1726853690.99011: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853690.99014: Calling groups_plugins_play to load vars for managed_node3 29922 1726853691.00335: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853691.01990: done with get_vars() 29922 1726853691.02027: done queuing things up, now waiting for results queue to drain 29922 1726853691.02030: results queue empty 29922 1726853691.02030: checking for any_errors_fatal 29922 1726853691.02032: done checking for any_errors_fatal 29922 1726853691.02033: checking for max_fail_percentage 29922 1726853691.02034: done checking for max_fail_percentage 29922 1726853691.02034: checking to see if all hosts have failed and the running result is not ok 29922 1726853691.02035: done checking to see if all hosts have failed 29922 1726853691.02036: getting the remaining hosts for this loop 29922 1726853691.02037: done getting the remaining hosts for this loop 29922 1726853691.02040: getting the next task for host managed_node3 29922 1726853691.02043: done getting next task for host managed_node3 29922 1726853691.02045: ^ task is: TASK: meta (flush_handlers) 29922 1726853691.02047: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853691.02049: getting variables 29922 1726853691.02050: in VariableManager get_vars() 29922 1726853691.02067: Calling all_inventory to load vars for managed_node3 29922 1726853691.02069: Calling groups_inventory to load vars for managed_node3 29922 1726853691.02073: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853691.02079: Calling all_plugins_play to load vars for managed_node3 29922 1726853691.02082: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853691.02085: Calling groups_plugins_play to load vars for managed_node3 29922 1726853691.03314: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853691.04993: done with get_vars() 29922 1726853691.05025: done getting variables 29922 1726853691.05086: in VariableManager get_vars() 29922 1726853691.05101: Calling all_inventory to load vars for managed_node3 29922 1726853691.05104: Calling groups_inventory to load vars for managed_node3 29922 1726853691.05106: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853691.05111: Calling all_plugins_play to load vars for managed_node3 29922 1726853691.05114: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853691.05117: Calling groups_plugins_play to load vars for managed_node3 29922 1726853691.06327: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853691.07997: done with get_vars() 29922 1726853691.08035: done queuing things up, now waiting for results queue to drain 29922 1726853691.08037: results queue empty 29922 1726853691.08038: checking for any_errors_fatal 29922 1726853691.08039: done checking for any_errors_fatal 29922 1726853691.08040: checking for max_fail_percentage 29922 1726853691.08041: done checking for max_fail_percentage 29922 1726853691.08042: checking to see if all hosts have failed and the running result is not ok 29922 1726853691.08043: done checking to see if all hosts have failed 29922 1726853691.08044: getting the remaining hosts for this loop 29922 1726853691.08045: done getting the remaining hosts for this loop 29922 1726853691.08047: getting the next task for host managed_node3 29922 1726853691.08051: done getting next task for host managed_node3 29922 1726853691.08052: ^ task is: None 29922 1726853691.08053: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853691.08057: done queuing things up, now waiting for results queue to drain 29922 1726853691.08058: results queue empty 29922 1726853691.08059: checking for any_errors_fatal 29922 1726853691.08059: done checking for any_errors_fatal 29922 1726853691.08060: checking for max_fail_percentage 29922 1726853691.08061: done checking for max_fail_percentage 29922 1726853691.08062: checking to see if all hosts have failed and the running result is not ok 29922 1726853691.08063: done checking to see if all hosts have failed 29922 1726853691.08064: getting the next task for host managed_node3 29922 1726853691.08066: done getting next task for host managed_node3 29922 1726853691.08067: ^ task is: None 29922 1726853691.08068: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29922 1726853691.08119: in VariableManager get_vars() 29922 1726853691.08136: done with get_vars() 29922 1726853691.08142: in VariableManager get_vars() 29922 1726853691.08151: done with get_vars() 29922 1726853691.08157: variable 'omit' from source: magic vars 29922 1726853691.08192: in VariableManager get_vars() 29922 1726853691.08203: done with get_vars() 29922 1726853691.08224: variable 'omit' from source: magic vars PLAY [Assert device and profile are absent] ************************************ 29922 1726853691.08476: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 29922 1726853691.08501: getting the remaining hosts for this loop 29922 1726853691.08502: done getting the remaining hosts for this loop 29922 1726853691.08504: getting the next task for host managed_node3 29922 1726853691.08507: done getting next task for host managed_node3 29922 1726853691.08509: ^ task is: TASK: Gathering Facts 29922 1726853691.08510: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853691.08513: getting variables 29922 1726853691.08513: in VariableManager get_vars() 29922 1726853691.08523: Calling all_inventory to load vars for managed_node3 29922 1726853691.08526: Calling groups_inventory to load vars for managed_node3 29922 1726853691.08528: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853691.08534: Calling all_plugins_play to load vars for managed_node3 29922 1726853691.08537: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853691.08540: Calling groups_plugins_play to load vars for managed_node3 29922 1726853691.09899: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853691.11490: done with get_vars() 29922 1726853691.11515: done getting variables 29922 1726853691.11564: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:227 Friday 20 September 2024 13:34:51 -0400 (0:00:00.621) 0:00:40.045 ****** 29922 1726853691.11592: entering _queue_task() for managed_node3/gather_facts 29922 1726853691.11938: worker is 1 (out of 1 available) 29922 1726853691.11949: exiting _queue_task() for managed_node3/gather_facts 29922 1726853691.11962: done queuing things up, now waiting for results queue to drain 29922 1726853691.11964: waiting for pending results... 
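At this point the strategy has moved on to the next play, "Assert device and profile are absent", and queues its implicit Gathering Facts (setup) task before any play tasks run. A hedged sketch of a play header that drives this step follows; the play name and host come from the log, while the hosts pattern, gather_facts setting, and task body are assumptions about the test playbook at tests_routing_rules.yml:227.

    # Assumed shape of the play that triggers the fact gathering traced in the next records.
    - name: Assert device and profile are absent
      hosts: managed_node3        # assumption: the log runs this play against managed_node3
      gather_facts: true          # implicit setup run; facts such as
                                  # ansible_distribution_major_version feed the conditional
                                  # evaluated immediately afterwards in the log
      tasks: []                   # the real assertions are omitted from this sketch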
29922 1726853691.12249: running TaskExecutor() for managed_node3/TASK: Gathering Facts 29922 1726853691.12395: in run() - task 02083763-bbaf-51d4-513b-00000000066a 29922 1726853691.12400: variable 'ansible_search_path' from source: unknown 29922 1726853691.12427: calling self._execute() 29922 1726853691.12530: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853691.12612: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853691.12615: variable 'omit' from source: magic vars 29922 1726853691.12966: variable 'ansible_distribution_major_version' from source: facts 29922 1726853691.12986: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853691.12996: variable 'omit' from source: magic vars 29922 1726853691.13022: variable 'omit' from source: magic vars 29922 1726853691.13067: variable 'omit' from source: magic vars 29922 1726853691.13113: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29922 1726853691.13158: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29922 1726853691.13185: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29922 1726853691.13206: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853691.13221: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853691.13253: variable 'inventory_hostname' from source: host vars for 'managed_node3' 29922 1726853691.13373: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853691.13376: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853691.13378: Set connection var ansible_connection to ssh 29922 1726853691.13386: Set connection var ansible_module_compression to ZIP_DEFLATED 29922 1726853691.13399: Set connection var ansible_shell_executable to /bin/sh 29922 1726853691.13410: Set connection var ansible_pipelining to False 29922 1726853691.13418: Set connection var ansible_timeout to 10 29922 1726853691.13424: Set connection var ansible_shell_type to sh 29922 1726853691.13452: variable 'ansible_shell_executable' from source: unknown 29922 1726853691.13462: variable 'ansible_connection' from source: unknown 29922 1726853691.13468: variable 'ansible_module_compression' from source: unknown 29922 1726853691.13479: variable 'ansible_shell_type' from source: unknown 29922 1726853691.13485: variable 'ansible_shell_executable' from source: unknown 29922 1726853691.13491: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853691.13498: variable 'ansible_pipelining' from source: unknown 29922 1726853691.13504: variable 'ansible_timeout' from source: unknown 29922 1726853691.13511: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853691.13695: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 29922 1726853691.13710: variable 'omit' from source: magic vars 29922 1726853691.13720: starting attempt loop 29922 1726853691.13727: running the 
handler 29922 1726853691.13746: variable 'ansible_facts' from source: unknown 29922 1726853691.13773: _low_level_execute_command(): starting 29922 1726853691.13786: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 29922 1726853691.14587: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853691.14633: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853691.14651: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853691.14681: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853691.14778: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853691.16482: stdout chunk (state=3): >>>/root <<< 29922 1726853691.16644: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853691.16648: stdout chunk (state=3): >>><<< 29922 1726853691.16650: stderr chunk (state=3): >>><<< 29922 1726853691.16681: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853691.16703: _low_level_execute_command(): starting 29922 1726853691.16793: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853691.1668901-31785-265341829852985 `" && echo ansible-tmp-1726853691.1668901-31785-265341829852985="` echo 
/root/.ansible/tmp/ansible-tmp-1726853691.1668901-31785-265341829852985 `" ) && sleep 0' 29922 1726853691.17385: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853691.17399: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853691.17416: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853691.17469: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853691.17553: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853691.17591: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853691.17690: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853691.19613: stdout chunk (state=3): >>>ansible-tmp-1726853691.1668901-31785-265341829852985=/root/.ansible/tmp/ansible-tmp-1726853691.1668901-31785-265341829852985 <<< 29922 1726853691.19782: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853691.19785: stdout chunk (state=3): >>><<< 29922 1726853691.19788: stderr chunk (state=3): >>><<< 29922 1726853691.19877: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853691.1668901-31785-265341829852985=/root/.ansible/tmp/ansible-tmp-1726853691.1668901-31785-265341829852985 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853691.19881: variable 'ansible_module_compression' from source: unknown 29922 1726853691.19916: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-29922avbm1zsd/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 29922 1726853691.19984: variable 'ansible_facts' from source: unknown 29922 1726853691.20215: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853691.1668901-31785-265341829852985/AnsiballZ_setup.py 29922 1726853691.20474: Sending initial data 29922 1726853691.20477: Sent initial data (154 bytes) 29922 1726853691.21016: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853691.21030: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853691.21086: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853691.21159: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853691.21180: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853691.21212: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853691.21299: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853691.22930: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 29922 1726853691.22998: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 29922 1726853691.23078: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-29922avbm1zsd/tmpss7f77ak /root/.ansible/tmp/ansible-tmp-1726853691.1668901-31785-265341829852985/AnsiballZ_setup.py <<< 29922 1726853691.23082: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853691.1668901-31785-265341829852985/AnsiballZ_setup.py" <<< 29922 1726853691.23142: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-29922avbm1zsd/tmpss7f77ak" to remote "/root/.ansible/tmp/ansible-tmp-1726853691.1668901-31785-265341829852985/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853691.1668901-31785-265341829852985/AnsiballZ_setup.py" <<< 29922 1726853691.24795: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853691.24809: stderr chunk (state=3): >>><<< 29922 1726853691.24935: stdout chunk (state=3): >>><<< 29922 1726853691.24938: done transferring module to remote 29922 1726853691.24940: _low_level_execute_command(): starting 29922 1726853691.24943: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853691.1668901-31785-265341829852985/ /root/.ansible/tmp/ansible-tmp-1726853691.1668901-31785-265341829852985/AnsiballZ_setup.py && sleep 0' 29922 1726853691.25518: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853691.25543: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853691.25624: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853691.27554: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853691.27558: stdout chunk (state=3): >>><<< 29922 1726853691.27561: stderr chunk (state=3): >>><<< 29922 1726853691.27588: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853691.27676: _low_level_execute_command(): starting 29922 1726853691.27680: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853691.1668901-31785-265341829852985/AnsiballZ_setup.py && sleep 0' 29922 1726853691.28307: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853691.28328: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853691.28349: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853691.28385: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<< 29922 1726853691.28398: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 29922 1726853691.28453: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853691.28506: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853691.28525: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853691.28580: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853691.28695: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853691.93830: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCoYDjPBT+oZMoH+Iz89VMad3pzgkIKqOuUO8QZPyEpVfgGNjVOaglgkXLQyOXulclB6EA4nlBVmVP6IyWL+N4gskjf5Qmm5n+WHu3amXXm9l66v+yKWruV18sgIn8o6iAdCgZrJFzdFVNeKRLYV6P67syyegqRkOJe7U2m/rxA967Vm6wcrwJN8eiZc8tbx1lHOuJNCcP20ThNxMHIPfvT2im8rlt/ojf6e+Z1axRnvFBubBg1qDfxHEX6AlxMHl+CIOXxGHxsvSxfLq7lhrXrComBSxTDv+fmHeJkck3VGck2rn8Hz3eBTty453RU3pq6KxdqU1GB+Z+VYHInXEtXb2i38QpLqRfiLqwUxdjkFKqz0j5z/+NskQF3w8j6uz77Revs9G8eMs14sICJtnD9qBLhFuGxlR/ovZCjynVjBTKBfDaVASpjV0iCIZKSgLn6zSaM/6FBhnBoeb4ch2iiw2S8Q0aKJNKIp0AfY21IVplyS3VCu+Br3fIXzoINo0s=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIX3PaAHBZzPq23oQJkywX/2bB39yz9ZrSLgsNfhL04NHwnY0Up/oiN+aiUte1DWFqV5wiDLJpl9a1tDLARWXNA=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": 
"AAAAC3NzaC1lZDI1NTE5AAAAIBb2CS4KVp6KVVqnGA45j7tkSkijXfGxbd3neKpDsjh9", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-11-217.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-217", "ansible_nodename": "ip-10-31-11-217.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2a85cf21f17783c9da20681cb8e352", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_local": {}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "34", "second": "51", "epoch": "1726853691", "epoch_int": "1726853691", "date": "2024-09-20", "time": "13:34:51", "iso8601_micro": "2024-09-20T17:34:51.571691Z", "iso8601": "2024-09-20T17:34:51Z", "iso8601_basic": "20240920T133451571691", "iso8601_basic_short": "20240920T133451", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_is_chroot": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_loadavg": {"1m": 0.603515625, "5m": 0.517578125, "15m": 0.31494140625}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_lsb": {}, "ansible_fips": false, "ansible_iscsi_iqn": "", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_co<<< 29922 1726853691.93856: stdout chunk (state=3): >>>re": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2970, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 561, "free": 2970}, "nocache": {"free": 3288, "used": 243}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, 
"ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2a85cf-21f1-7783-c9da-20681cb8e352", "ansible_product_uuid": "ec2a85cf-21f1-7783-c9da-20681cb8e352", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 835, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261798715392, "block_size": 4096, "block_total": 65519099, "block_available": 63915702, "block_used": 1603397, "inode_total": 131070960, "inode_available": 131029146, "inode_used": 41814, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 55604 10.31.11.217 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 55604 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_fibre_channel_wwn": [], "ansible_apparmor": {"status": "disabled"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", 
"console": ["tty0", "ttyS0,115200n8"]}, "ansible_interfaces": ["eth0", "rpltstbr", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "12:2a:53:36:f0:e9", "mtu": 9001, "active": true, "mo<<< 29922 1726853691.93866: stdout chunk (state=3): >>>dule": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::102a:53ff:fe36:f0e9", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", 
"tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_se<<< 29922 1726853691.93904: stdout chunk (state=3): >>>gmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_rpltstbr": {"device": "rpltstbr", "macaddress": "6e:7c:f1:8e:1c:81", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "ipv4": {"address": "192.0.2.72", "broadcast": "", "netmask": "255.255.255.254", "network": "192.0.2.72", "prefix": "31"}, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", 
"tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:2a:53:36:f0:e9", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.217", "192.0.2.72"], "ansible_all_ipv6_addresses": ["fe80::102a:53ff:fe36:f0e9"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.217", "127.0.0.0/8", "127.0.0.1", "192.0.2.72"], "ipv6": ["::1", "fe80::102a:53ff:fe36:f0e9"]}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 29922 1726853691.95950: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 29922 1726853691.95978: stderr chunk (state=3): >>><<< 29922 1726853691.95982: stdout chunk (state=3): >>><<< 29922 1726853691.96039: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCoYDjPBT+oZMoH+Iz89VMad3pzgkIKqOuUO8QZPyEpVfgGNjVOaglgkXLQyOXulclB6EA4nlBVmVP6IyWL+N4gskjf5Qmm5n+WHu3amXXm9l66v+yKWruV18sgIn8o6iAdCgZrJFzdFVNeKRLYV6P67syyegqRkOJe7U2m/rxA967Vm6wcrwJN8eiZc8tbx1lHOuJNCcP20ThNxMHIPfvT2im8rlt/ojf6e+Z1axRnvFBubBg1qDfxHEX6AlxMHl+CIOXxGHxsvSxfLq7lhrXrComBSxTDv+fmHeJkck3VGck2rn8Hz3eBTty453RU3pq6KxdqU1GB+Z+VYHInXEtXb2i38QpLqRfiLqwUxdjkFKqz0j5z/+NskQF3w8j6uz77Revs9G8eMs14sICJtnD9qBLhFuGxlR/ovZCjynVjBTKBfDaVASpjV0iCIZKSgLn6zSaM/6FBhnBoeb4ch2iiw2S8Q0aKJNKIp0AfY21IVplyS3VCu+Br3fIXzoINo0s=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIX3PaAHBZzPq23oQJkywX/2bB39yz9ZrSLgsNfhL04NHwnY0Up/oiN+aiUte1DWFqV5wiDLJpl9a1tDLARWXNA=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIBb2CS4KVp6KVVqnGA45j7tkSkijXfGxbd3neKpDsjh9", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-11-217.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-217", "ansible_nodename": "ip-10-31-11-217.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2a85cf21f17783c9da20681cb8e352", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_local": {}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "34", "second": "51", "epoch": "1726853691", "epoch_int": "1726853691", "date": "2024-09-20", "time": "13:34:51", "iso8601_micro": "2024-09-20T17:34:51.571691Z", "iso8601": "2024-09-20T17:34:51Z", "iso8601_basic": "20240920T133451571691", "iso8601_basic_short": "20240920T133451", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_is_chroot": false, "ansible_selinux_python_present": true, "ansible_selinux": 
{"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_loadavg": {"1m": 0.603515625, "5m": 0.517578125, "15m": 0.31494140625}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_lsb": {}, "ansible_fips": false, "ansible_iscsi_iqn": "", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2970, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 561, "free": 2970}, "nocache": {"free": 3288, "used": 243}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2a85cf-21f1-7783-c9da-20681cb8e352", "ansible_product_uuid": "ec2a85cf-21f1-7783-c9da-20681cb8e352", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 835, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261798715392, "block_size": 4096, "block_total": 65519099, "block_available": 63915702, "block_used": 1603397, "inode_total": 131070960, "inode_available": 131029146, "inode_used": 41814, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 55604 10.31.11.217 22", "XDG_SESSION_CLASS": 
"user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 55604 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_fibre_channel_wwn": [], "ansible_apparmor": {"status": "disabled"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_interfaces": ["eth0", "rpltstbr", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "12:2a:53:36:f0:e9", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::102a:53ff:fe36:f0e9", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": 
"off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_rpltstbr": {"device": "rpltstbr", "macaddress": "6e:7c:f1:8e:1c:81", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "ipv4": {"address": "192.0.2.72", "broadcast": "", "netmask": "255.255.255.254", "network": "192.0.2.72", "prefix": "31"}, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", 
"tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:2a:53:36:f0:e9", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.217", "192.0.2.72"], "ansible_all_ipv6_addresses": ["fe80::102a:53ff:fe36:f0e9"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.217", "127.0.0.0/8", "127.0.0.1", "192.0.2.72"], "ipv6": ["::1", "fe80::102a:53ff:fe36:f0e9"]}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 29922 1726853691.96380: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853691.1668901-31785-265341829852985/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 29922 1726853691.96487: _low_level_execute_command(): starting 29922 1726853691.96491: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853691.1668901-31785-265341829852985/ > /dev/null 2>&1 && sleep 0' 29922 1726853691.97011: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853691.97015: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853691.97085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853691.97106: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853691.97129: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853691.97152: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853691.97244: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853691.99094: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853691.99146: stderr chunk (state=3): >>><<< 29922 1726853691.99160: stdout chunk (state=3): >>><<< 29922 1726853691.99187: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853691.99378: handler run complete 29922 1726853691.99382: variable 'ansible_facts' from source: unknown 29922 1726853691.99462: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853691.99808: variable 'ansible_facts' from source: unknown 29922 1726853691.99904: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853692.00057: attempt loop complete, returning result 29922 1726853692.00067: _execute() done 29922 1726853692.00078: dumping result to json 29922 1726853692.00116: done dumping result, returning 29922 1726853692.00127: done running TaskExecutor() for managed_node3/TASK: Gathering Facts [02083763-bbaf-51d4-513b-00000000066a] 29922 1726853692.00135: sending task result for task 02083763-bbaf-51d4-513b-00000000066a 29922 1726853692.00777: done sending task result for task 02083763-bbaf-51d4-513b-00000000066a 29922 1726853692.00780: WORKER PROCESS EXITING ok: [managed_node3] 29922 1726853692.01236: no more pending results, returning what we have 29922 1726853692.01239: results queue empty 29922 1726853692.01240: checking for any_errors_fatal 29922 1726853692.01241: done checking for any_errors_fatal 29922 1726853692.01242: checking for max_fail_percentage 29922 1726853692.01243: done checking for max_fail_percentage 29922 1726853692.01244: checking to see if all hosts have failed and the running result is not ok 29922 1726853692.01245: done checking to see if all hosts have failed 29922 1726853692.01246: getting the remaining hosts for this loop 29922 1726853692.01247: done getting the remaining hosts for this loop 29922 1726853692.01250: getting the next task for host managed_node3 29922 1726853692.01255: done getting next task for host managed_node3 29922 1726853692.01257: ^ task is: TASK: meta (flush_handlers) 29922 1726853692.01259: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853692.01264: getting variables 29922 1726853692.01267: in VariableManager get_vars() 29922 1726853692.01294: Calling all_inventory to load vars for managed_node3 29922 1726853692.01297: Calling groups_inventory to load vars for managed_node3 29922 1726853692.01301: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853692.01311: Calling all_plugins_play to load vars for managed_node3 29922 1726853692.01315: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853692.01318: Calling groups_plugins_play to load vars for managed_node3 29922 1726853692.02767: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853692.04434: done with get_vars() 29922 1726853692.04463: done getting variables 29922 1726853692.04532: in VariableManager get_vars() 29922 1726853692.04542: Calling all_inventory to load vars for managed_node3 29922 1726853692.04544: Calling groups_inventory to load vars for managed_node3 29922 1726853692.04547: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853692.04551: Calling all_plugins_play to load vars for managed_node3 29922 1726853692.04553: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853692.04559: Calling groups_plugins_play to load vars for managed_node3 29922 1726853692.05744: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853692.07838: done with get_vars() 29922 1726853692.07876: done queuing things up, now waiting for results queue to drain 29922 1726853692.07910: results queue empty 29922 1726853692.07911: checking for any_errors_fatal 29922 1726853692.07915: done checking for any_errors_fatal 29922 1726853692.07916: checking for max_fail_percentage 29922 1726853692.07917: done checking for max_fail_percentage 29922 1726853692.07923: checking to see if all hosts have failed and the running result is not ok 29922 1726853692.07924: done checking to see if all hosts have failed 29922 1726853692.07924: getting the remaining hosts for this loop 29922 1726853692.07926: done getting the remaining hosts for this loop 29922 1726853692.07929: getting the next task for host managed_node3 29922 1726853692.07933: done getting next task for host managed_node3 29922 1726853692.07936: ^ task is: TASK: Include the task 'assert_profile_absent.yml' 29922 1726853692.07937: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853692.07970: getting variables 29922 1726853692.07974: in VariableManager get_vars() 29922 1726853692.07986: Calling all_inventory to load vars for managed_node3 29922 1726853692.07989: Calling groups_inventory to load vars for managed_node3 29922 1726853692.07999: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853692.08076: Calling all_plugins_play to load vars for managed_node3 29922 1726853692.08107: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853692.08112: Calling groups_plugins_play to load vars for managed_node3 29922 1726853692.10583: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853692.12288: done with get_vars() 29922 1726853692.12317: done getting variables TASK [Include the task 'assert_profile_absent.yml'] **************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:230 Friday 20 September 2024 13:34:52 -0400 (0:00:01.008) 0:00:41.053 ****** 29922 1726853692.12399: entering _queue_task() for managed_node3/include_tasks 29922 1726853692.12777: worker is 1 (out of 1 available) 29922 1726853692.12793: exiting _queue_task() for managed_node3/include_tasks 29922 1726853692.12812: done queuing things up, now waiting for results queue to drain 29922 1726853692.12813: waiting for pending results... 29922 1726853692.13167: running TaskExecutor() for managed_node3/TASK: Include the task 'assert_profile_absent.yml' 29922 1726853692.13323: in run() - task 02083763-bbaf-51d4-513b-0000000000a9 29922 1726853692.13344: variable 'ansible_search_path' from source: unknown 29922 1726853692.13392: calling self._execute() 29922 1726853692.13510: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853692.13531: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853692.13548: variable 'omit' from source: magic vars 29922 1726853692.13987: variable 'ansible_distribution_major_version' from source: facts 29922 1726853692.14043: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853692.14047: _execute() done 29922 1726853692.14050: dumping result to json 29922 1726853692.14053: done dumping result, returning 29922 1726853692.14061: done running TaskExecutor() for managed_node3/TASK: Include the task 'assert_profile_absent.yml' [02083763-bbaf-51d4-513b-0000000000a9] 29922 1726853692.14063: sending task result for task 02083763-bbaf-51d4-513b-0000000000a9 29922 1726853692.14346: done sending task result for task 02083763-bbaf-51d4-513b-0000000000a9 29922 1726853692.14350: WORKER PROCESS EXITING 29922 1726853692.14386: no more pending results, returning what we have 29922 1726853692.14392: in VariableManager get_vars() 29922 1726853692.14444: Calling all_inventory to load vars for managed_node3 29922 1726853692.14447: Calling groups_inventory to load vars for managed_node3 29922 1726853692.14451: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853692.14468: Calling all_plugins_play to load vars for managed_node3 29922 1726853692.14474: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853692.14477: Calling groups_plugins_play to load vars for managed_node3 29922 1726853692.16174: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853692.18032: done with get_vars() 29922 
1726853692.18051: variable 'ansible_search_path' from source: unknown 29922 1726853692.18069: we have included files to process 29922 1726853692.18070: generating all_blocks data 29922 1726853692.18073: done generating all_blocks data 29922 1726853692.18074: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 29922 1726853692.18075: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 29922 1726853692.18077: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 29922 1726853692.18241: in VariableManager get_vars() 29922 1726853692.18266: done with get_vars() 29922 1726853692.18381: done processing included file 29922 1726853692.18383: iterating over new_blocks loaded from include file 29922 1726853692.18385: in VariableManager get_vars() 29922 1726853692.18396: done with get_vars() 29922 1726853692.18398: filtering new block on tags 29922 1726853692.18413: done filtering new block on tags 29922 1726853692.18416: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml for managed_node3 29922 1726853692.18421: extending task lists for all hosts with included blocks 29922 1726853692.18492: done extending task lists 29922 1726853692.18493: done processing included files 29922 1726853692.18494: results queue empty 29922 1726853692.18495: checking for any_errors_fatal 29922 1726853692.18497: done checking for any_errors_fatal 29922 1726853692.18497: checking for max_fail_percentage 29922 1726853692.18498: done checking for max_fail_percentage 29922 1726853692.18499: checking to see if all hosts have failed and the running result is not ok 29922 1726853692.18500: done checking to see if all hosts have failed 29922 1726853692.18501: getting the remaining hosts for this loop 29922 1726853692.18502: done getting the remaining hosts for this loop 29922 1726853692.18504: getting the next task for host managed_node3 29922 1726853692.18508: done getting next task for host managed_node3 29922 1726853692.18510: ^ task is: TASK: Include the task 'get_profile_stat.yml' 29922 1726853692.18513: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853692.18515: getting variables 29922 1726853692.18516: in VariableManager get_vars() 29922 1726853692.18525: Calling all_inventory to load vars for managed_node3 29922 1726853692.18527: Calling groups_inventory to load vars for managed_node3 29922 1726853692.18529: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853692.18535: Calling all_plugins_play to load vars for managed_node3 29922 1726853692.18537: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853692.18540: Calling groups_plugins_play to load vars for managed_node3 29922 1726853692.19790: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853692.21475: done with get_vars() 29922 1726853692.21501: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:3 Friday 20 September 2024 13:34:52 -0400 (0:00:00.091) 0:00:41.145 ****** 29922 1726853692.21587: entering _queue_task() for managed_node3/include_tasks 29922 1726853692.21959: worker is 1 (out of 1 available) 29922 1726853692.21975: exiting _queue_task() for managed_node3/include_tasks 29922 1726853692.21992: done queuing things up, now waiting for results queue to drain 29922 1726853692.21994: waiting for pending results... 29922 1726853692.22386: running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' 29922 1726853692.22391: in run() - task 02083763-bbaf-51d4-513b-00000000067b 29922 1726853692.22395: variable 'ansible_search_path' from source: unknown 29922 1726853692.22398: variable 'ansible_search_path' from source: unknown 29922 1726853692.22418: calling self._execute() 29922 1726853692.22519: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853692.22532: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853692.22546: variable 'omit' from source: magic vars 29922 1726853692.22908: variable 'ansible_distribution_major_version' from source: facts 29922 1726853692.22924: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853692.22932: _execute() done 29922 1726853692.22938: dumping result to json 29922 1726853692.22944: done dumping result, returning 29922 1726853692.22953: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' [02083763-bbaf-51d4-513b-00000000067b] 29922 1726853692.22961: sending task result for task 02083763-bbaf-51d4-513b-00000000067b 29922 1726853692.23085: no more pending results, returning what we have 29922 1726853692.23091: in VariableManager get_vars() 29922 1726853692.23126: Calling all_inventory to load vars for managed_node3 29922 1726853692.23129: Calling groups_inventory to load vars for managed_node3 29922 1726853692.23132: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853692.23146: Calling all_plugins_play to load vars for managed_node3 29922 1726853692.23148: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853692.23151: Calling groups_plugins_play to load vars for managed_node3 29922 1726853692.23684: done sending task result for task 02083763-bbaf-51d4-513b-00000000067b 29922 1726853692.23688: WORKER PROCESS EXITING 29922 1726853692.25013: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 29922 1726853692.26601: done with get_vars() 29922 1726853692.26622: variable 'ansible_search_path' from source: unknown 29922 1726853692.26623: variable 'ansible_search_path' from source: unknown 29922 1726853692.26663: we have included files to process 29922 1726853692.26664: generating all_blocks data 29922 1726853692.26666: done generating all_blocks data 29922 1726853692.26667: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 29922 1726853692.26668: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 29922 1726853692.26670: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 29922 1726853692.27681: done processing included file 29922 1726853692.27683: iterating over new_blocks loaded from include file 29922 1726853692.27684: in VariableManager get_vars() 29922 1726853692.27698: done with get_vars() 29922 1726853692.27699: filtering new block on tags 29922 1726853692.27722: done filtering new block on tags 29922 1726853692.27725: in VariableManager get_vars() 29922 1726853692.27737: done with get_vars() 29922 1726853692.27738: filtering new block on tags 29922 1726853692.27758: done filtering new block on tags 29922 1726853692.27760: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node3 29922 1726853692.27765: extending task lists for all hosts with included blocks 29922 1726853692.27867: done extending task lists 29922 1726853692.27868: done processing included files 29922 1726853692.27869: results queue empty 29922 1726853692.27870: checking for any_errors_fatal 29922 1726853692.27875: done checking for any_errors_fatal 29922 1726853692.27876: checking for max_fail_percentage 29922 1726853692.27877: done checking for max_fail_percentage 29922 1726853692.27878: checking to see if all hosts have failed and the running result is not ok 29922 1726853692.27879: done checking to see if all hosts have failed 29922 1726853692.27879: getting the remaining hosts for this loop 29922 1726853692.27880: done getting the remaining hosts for this loop 29922 1726853692.27883: getting the next task for host managed_node3 29922 1726853692.27887: done getting next task for host managed_node3 29922 1726853692.27889: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 29922 1726853692.27892: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853692.27895: getting variables 29922 1726853692.27896: in VariableManager get_vars() 29922 1726853692.27966: Calling all_inventory to load vars for managed_node3 29922 1726853692.27969: Calling groups_inventory to load vars for managed_node3 29922 1726853692.27974: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853692.27980: Calling all_plugins_play to load vars for managed_node3 29922 1726853692.27983: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853692.27985: Calling groups_plugins_play to load vars for managed_node3 29922 1726853692.29103: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853692.30690: done with get_vars() 29922 1726853692.30717: done getting variables 29922 1726853692.30761: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 13:34:52 -0400 (0:00:00.092) 0:00:41.237 ****** 29922 1726853692.30794: entering _queue_task() for managed_node3/set_fact 29922 1726853692.31154: worker is 1 (out of 1 available) 29922 1726853692.31167: exiting _queue_task() for managed_node3/set_fact 29922 1726853692.31381: done queuing things up, now waiting for results queue to drain 29922 1726853692.31383: waiting for pending results... 
29922 1726853692.31463: running TaskExecutor() for managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag 29922 1726853692.31600: in run() - task 02083763-bbaf-51d4-513b-00000000068a 29922 1726853692.31627: variable 'ansible_search_path' from source: unknown 29922 1726853692.31635: variable 'ansible_search_path' from source: unknown 29922 1726853692.31680: calling self._execute() 29922 1726853692.31789: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853692.31799: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853692.31810: variable 'omit' from source: magic vars 29922 1726853692.32195: variable 'ansible_distribution_major_version' from source: facts 29922 1726853692.32215: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853692.32227: variable 'omit' from source: magic vars 29922 1726853692.32288: variable 'omit' from source: magic vars 29922 1726853692.32374: variable 'omit' from source: magic vars 29922 1726853692.32410: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29922 1726853692.32454: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29922 1726853692.32486: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29922 1726853692.32510: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853692.32527: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853692.32606: variable 'inventory_hostname' from source: host vars for 'managed_node3' 29922 1726853692.32610: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853692.32613: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853692.32776: Set connection var ansible_connection to ssh 29922 1726853692.32779: Set connection var ansible_module_compression to ZIP_DEFLATED 29922 1726853692.32782: Set connection var ansible_shell_executable to /bin/sh 29922 1726853692.32784: Set connection var ansible_pipelining to False 29922 1726853692.32786: Set connection var ansible_timeout to 10 29922 1726853692.32788: Set connection var ansible_shell_type to sh 29922 1726853692.32818: variable 'ansible_shell_executable' from source: unknown 29922 1726853692.32826: variable 'ansible_connection' from source: unknown 29922 1726853692.32834: variable 'ansible_module_compression' from source: unknown 29922 1726853692.32842: variable 'ansible_shell_type' from source: unknown 29922 1726853692.32912: variable 'ansible_shell_executable' from source: unknown 29922 1726853692.32915: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853692.32918: variable 'ansible_pipelining' from source: unknown 29922 1726853692.32920: variable 'ansible_timeout' from source: unknown 29922 1726853692.32923: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853692.33032: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 29922 1726853692.33050: variable 
'omit' from source: magic vars 29922 1726853692.33062: starting attempt loop 29922 1726853692.33069: running the handler 29922 1726853692.33090: handler run complete 29922 1726853692.33106: attempt loop complete, returning result 29922 1726853692.33113: _execute() done 29922 1726853692.33131: dumping result to json 29922 1726853692.33150: done dumping result, returning 29922 1726853692.33162: done running TaskExecutor() for managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag [02083763-bbaf-51d4-513b-00000000068a] 29922 1726853692.33173: sending task result for task 02083763-bbaf-51d4-513b-00000000068a 29922 1726853692.33576: done sending task result for task 02083763-bbaf-51d4-513b-00000000068a 29922 1726853692.33580: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 29922 1726853692.33646: no more pending results, returning what we have 29922 1726853692.33649: results queue empty 29922 1726853692.33650: checking for any_errors_fatal 29922 1726853692.33652: done checking for any_errors_fatal 29922 1726853692.33653: checking for max_fail_percentage 29922 1726853692.33654: done checking for max_fail_percentage 29922 1726853692.33655: checking to see if all hosts have failed and the running result is not ok 29922 1726853692.33656: done checking to see if all hosts have failed 29922 1726853692.33657: getting the remaining hosts for this loop 29922 1726853692.33658: done getting the remaining hosts for this loop 29922 1726853692.33662: getting the next task for host managed_node3 29922 1726853692.33667: done getting next task for host managed_node3 29922 1726853692.33670: ^ task is: TASK: Stat profile file 29922 1726853692.33676: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853692.33681: getting variables 29922 1726853692.33682: in VariableManager get_vars() 29922 1726853692.33710: Calling all_inventory to load vars for managed_node3 29922 1726853692.33713: Calling groups_inventory to load vars for managed_node3 29922 1726853692.33716: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853692.33727: Calling all_plugins_play to load vars for managed_node3 29922 1726853692.33729: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853692.33733: Calling groups_plugins_play to load vars for managed_node3 29922 1726853692.35121: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853692.36002: done with get_vars() 29922 1726853692.36019: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 13:34:52 -0400 (0:00:00.052) 0:00:41.290 ****** 29922 1726853692.36087: entering _queue_task() for managed_node3/stat 29922 1726853692.36341: worker is 1 (out of 1 available) 29922 1726853692.36355: exiting _queue_task() for managed_node3/stat 29922 1726853692.36367: done queuing things up, now waiting for results queue to drain 29922 1726853692.36368: waiting for pending results... 29922 1726853692.36547: running TaskExecutor() for managed_node3/TASK: Stat profile file 29922 1726853692.36625: in run() - task 02083763-bbaf-51d4-513b-00000000068b 29922 1726853692.36636: variable 'ansible_search_path' from source: unknown 29922 1726853692.36641: variable 'ansible_search_path' from source: unknown 29922 1726853692.36691: calling self._execute() 29922 1726853692.36977: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853692.36980: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853692.36983: variable 'omit' from source: magic vars 29922 1726853692.37252: variable 'ansible_distribution_major_version' from source: facts 29922 1726853692.37269: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853692.37283: variable 'omit' from source: magic vars 29922 1726853692.37342: variable 'omit' from source: magic vars 29922 1726853692.37449: variable 'profile' from source: include params 29922 1726853692.37462: variable 'interface' from source: set_fact 29922 1726853692.37613: variable 'interface' from source: set_fact 29922 1726853692.37654: variable 'omit' from source: magic vars 29922 1726853692.37710: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29922 1726853692.37778: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29922 1726853692.37787: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29922 1726853692.37806: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853692.37815: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853692.37854: variable 'inventory_hostname' from source: host vars for 'managed_node3' 29922 1726853692.37863: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853692.37880: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853692.37949: Set connection var ansible_connection to ssh 29922 1726853692.37958: Set connection var ansible_module_compression to ZIP_DEFLATED 29922 1726853692.37963: Set connection var ansible_shell_executable to /bin/sh 29922 1726853692.37969: Set connection var ansible_pipelining to False 29922 1726853692.37976: Set connection var ansible_timeout to 10 29922 1726853692.37979: Set connection var ansible_shell_type to sh 29922 1726853692.38000: variable 'ansible_shell_executable' from source: unknown 29922 1726853692.38004: variable 'ansible_connection' from source: unknown 29922 1726853692.38007: variable 'ansible_module_compression' from source: unknown 29922 1726853692.38010: variable 'ansible_shell_type' from source: unknown 29922 1726853692.38013: variable 'ansible_shell_executable' from source: unknown 29922 1726853692.38020: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853692.38023: variable 'ansible_pipelining' from source: unknown 29922 1726853692.38026: variable 'ansible_timeout' from source: unknown 29922 1726853692.38028: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853692.38180: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 29922 1726853692.38188: variable 'omit' from source: magic vars 29922 1726853692.38194: starting attempt loop 29922 1726853692.38197: running the handler 29922 1726853692.38214: _low_level_execute_command(): starting 29922 1726853692.38217: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 29922 1726853692.38732: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853692.38736: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853692.38739: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853692.38741: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853692.38786: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853692.38808: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853692.38878: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853692.40582: stdout chunk (state=3): >>>/root <<< 29922 1726853692.40735: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853692.40738: stdout chunk 
(state=3): >>><<< 29922 1726853692.40741: stderr chunk (state=3): >>><<< 29922 1726853692.40760: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853692.40860: _low_level_execute_command(): starting 29922 1726853692.40864: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853692.4076717-31825-96688395609971 `" && echo ansible-tmp-1726853692.4076717-31825-96688395609971="` echo /root/.ansible/tmp/ansible-tmp-1726853692.4076717-31825-96688395609971 `" ) && sleep 0' 29922 1726853692.41309: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853692.41337: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853692.41341: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 29922 1726853692.41350: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853692.41353: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853692.41357: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853692.41400: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853692.41403: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853692.41470: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853692.43456: stdout chunk (state=3): 
>>>ansible-tmp-1726853692.4076717-31825-96688395609971=/root/.ansible/tmp/ansible-tmp-1726853692.4076717-31825-96688395609971 <<< 29922 1726853692.43573: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853692.43597: stderr chunk (state=3): >>><<< 29922 1726853692.43601: stdout chunk (state=3): >>><<< 29922 1726853692.43638: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853692.4076717-31825-96688395609971=/root/.ansible/tmp/ansible-tmp-1726853692.4076717-31825-96688395609971 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853692.43693: variable 'ansible_module_compression' from source: unknown 29922 1726853692.43878: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-29922avbm1zsd/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 29922 1726853692.43881: variable 'ansible_facts' from source: unknown 29922 1726853692.43884: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853692.4076717-31825-96688395609971/AnsiballZ_stat.py 29922 1726853692.44094: Sending initial data 29922 1726853692.44099: Sent initial data (152 bytes) 29922 1726853692.44641: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853692.44651: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853692.44657: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853692.44685: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853692.44689: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853692.44728: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853692.44781: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853692.44788: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853692.44790: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853692.44852: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853692.46465: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 29922 1726853692.46474: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 29922 1726853692.46521: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 29922 1726853692.46608: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-29922avbm1zsd/tmpi3xevwgb /root/.ansible/tmp/ansible-tmp-1726853692.4076717-31825-96688395609971/AnsiballZ_stat.py <<< 29922 1726853692.46611: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853692.4076717-31825-96688395609971/AnsiballZ_stat.py" <<< 29922 1726853692.46693: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-29922avbm1zsd/tmpi3xevwgb" to remote "/root/.ansible/tmp/ansible-tmp-1726853692.4076717-31825-96688395609971/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853692.4076717-31825-96688395609971/AnsiballZ_stat.py" <<< 29922 1726853692.47464: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853692.47510: stderr chunk (state=3): >>><<< 29922 1726853692.47514: stdout chunk (state=3): >>><<< 29922 1726853692.47666: done transferring module to remote 29922 1726853692.47676: _low_level_execute_command(): starting 29922 1726853692.47679: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853692.4076717-31825-96688395609971/ /root/.ansible/tmp/ansible-tmp-1726853692.4076717-31825-96688395609971/AnsiballZ_stat.py && sleep 0' 29922 1726853692.48046: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853692.48049: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 29922 1726853692.48051: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 29922 1726853692.48057: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 29922 1726853692.48059: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853692.48109: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853692.48112: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853692.48175: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853692.50018: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853692.50051: stderr chunk (state=3): >>><<< 29922 1726853692.50054: stdout chunk (state=3): >>><<< 29922 1726853692.50133: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853692.50137: _low_level_execute_command(): starting 29922 1726853692.50139: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853692.4076717-31825-96688395609971/AnsiballZ_stat.py && sleep 0' 29922 1726853692.50674: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853692.50677: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853692.50680: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 29922 1726853692.50682: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853692.50723: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853692.50726: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853692.50826: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853692.66340: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 29922 1726853692.67982: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 29922 1726853692.67986: stdout chunk (state=3): >>><<< 29922 1726853692.67988: stderr chunk (state=3): >>><<< 29922 1726853692.67991: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
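The module_args in the stat result above are a plain stat call against the would-be initscripts profile for ethtest0. A minimal sketch of what the "Stat profile file" task in get_profile_stat.yml presumably looks like, reconstructed only from the arguments and variable names visible in this log (the register name and the literal path are assumptions; the path is almost certainly templated from the profile name in the real file):

- name: Stat profile file
  ansible.builtin.stat:
    # "ethtest0" is the rendered profile/interface name seen throughout this run
    path: /etc/sysconfig/network-scripts/ifcfg-ethtest0
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: profile_stat   # later tasks in this log gate on profile_stat.stat.exists

Because the file does not exist on the managed host, stat returns "exists": false, and the ifcfg-specific follow-up tasks further down are skipped.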
29922 1726853692.67994: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-ethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853692.4076717-31825-96688395609971/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 29922 1726853692.68003: _low_level_execute_command(): starting 29922 1726853692.68008: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853692.4076717-31825-96688395609971/ > /dev/null 2>&1 && sleep 0' 29922 1726853692.69027: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853692.69032: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853692.69087: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853692.69130: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853692.69135: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853692.69153: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853692.69257: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853692.71279: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853692.71282: stdout chunk (state=3): >>><<< 29922 1726853692.71284: stderr chunk (state=3): >>><<< 29922 1726853692.71580: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853692.71584: handler run complete 29922 1726853692.71586: attempt loop complete, returning result 29922 1726853692.71588: _execute() done 29922 1726853692.71590: dumping result to json 29922 1726853692.71593: done dumping result, returning 29922 1726853692.71595: done running TaskExecutor() for managed_node3/TASK: Stat profile file [02083763-bbaf-51d4-513b-00000000068b] 29922 1726853692.71597: sending task result for task 02083763-bbaf-51d4-513b-00000000068b ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 29922 1726853692.71785: no more pending results, returning what we have 29922 1726853692.71789: results queue empty 29922 1726853692.71790: checking for any_errors_fatal 29922 1726853692.71800: done checking for any_errors_fatal 29922 1726853692.71801: checking for max_fail_percentage 29922 1726853692.71803: done checking for max_fail_percentage 29922 1726853692.71805: checking to see if all hosts have failed and the running result is not ok 29922 1726853692.71806: done checking to see if all hosts have failed 29922 1726853692.71806: getting the remaining hosts for this loop 29922 1726853692.71808: done getting the remaining hosts for this loop 29922 1726853692.71813: getting the next task for host managed_node3 29922 1726853692.71821: done getting next task for host managed_node3 29922 1726853692.71824: ^ task is: TASK: Set NM profile exist flag based on the profile files 29922 1726853692.71829: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853692.71836: getting variables 29922 1726853692.71837: in VariableManager get_vars() 29922 1726853692.71982: Calling all_inventory to load vars for managed_node3 29922 1726853692.71985: Calling groups_inventory to load vars for managed_node3 29922 1726853692.71989: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853692.72004: Calling all_plugins_play to load vars for managed_node3 29922 1726853692.72008: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853692.72011: Calling groups_plugins_play to load vars for managed_node3 29922 1726853692.72789: done sending task result for task 02083763-bbaf-51d4-513b-00000000068b 29922 1726853692.72793: WORKER PROCESS EXITING 29922 1726853692.73740: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853692.81842: done with get_vars() 29922 1726853692.81874: done getting variables 29922 1726853692.81925: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 13:34:52 -0400 (0:00:00.458) 0:00:41.749 ****** 29922 1726853692.81960: entering _queue_task() for managed_node3/set_fact 29922 1726853692.82321: worker is 1 (out of 1 available) 29922 1726853692.82333: exiting _queue_task() for managed_node3/set_fact 29922 1726853692.82344: done queuing things up, now waiting for results queue to drain 29922 1726853692.82345: waiting for pending results... 
29922 1726853692.82640: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files 29922 1726853692.82768: in run() - task 02083763-bbaf-51d4-513b-00000000068c 29922 1726853692.82785: variable 'ansible_search_path' from source: unknown 29922 1726853692.82788: variable 'ansible_search_path' from source: unknown 29922 1726853692.82832: calling self._execute() 29922 1726853692.82943: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853692.82951: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853692.82961: variable 'omit' from source: magic vars 29922 1726853692.83383: variable 'ansible_distribution_major_version' from source: facts 29922 1726853692.83395: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853692.83518: variable 'profile_stat' from source: set_fact 29922 1726853692.83531: Evaluated conditional (profile_stat.stat.exists): False 29922 1726853692.83534: when evaluation is False, skipping this task 29922 1726853692.83537: _execute() done 29922 1726853692.83539: dumping result to json 29922 1726853692.83542: done dumping result, returning 29922 1726853692.83549: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files [02083763-bbaf-51d4-513b-00000000068c] 29922 1726853692.83561: sending task result for task 02083763-bbaf-51d4-513b-00000000068c 29922 1726853692.83648: done sending task result for task 02083763-bbaf-51d4-513b-00000000068c 29922 1726853692.83650: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 29922 1726853692.83707: no more pending results, returning what we have 29922 1726853692.83711: results queue empty 29922 1726853692.83712: checking for any_errors_fatal 29922 1726853692.83740: done checking for any_errors_fatal 29922 1726853692.83741: checking for max_fail_percentage 29922 1726853692.83743: done checking for max_fail_percentage 29922 1726853692.83744: checking to see if all hosts have failed and the running result is not ok 29922 1726853692.83744: done checking to see if all hosts have failed 29922 1726853692.83745: getting the remaining hosts for this loop 29922 1726853692.83746: done getting the remaining hosts for this loop 29922 1726853692.83755: getting the next task for host managed_node3 29922 1726853692.83762: done getting next task for host managed_node3 29922 1726853692.83765: ^ task is: TASK: Get NM profile info 29922 1726853692.83770: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853692.83779: getting variables 29922 1726853692.83783: in VariableManager get_vars() 29922 1726853692.83816: Calling all_inventory to load vars for managed_node3 29922 1726853692.83819: Calling groups_inventory to load vars for managed_node3 29922 1726853692.83822: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853692.83837: Calling all_plugins_play to load vars for managed_node3 29922 1726853692.84118: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853692.84124: Calling groups_plugins_play to load vars for managed_node3 29922 1726853692.87765: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853692.89927: done with get_vars() 29922 1726853692.89964: done getting variables 29922 1726853692.90078: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 13:34:52 -0400 (0:00:00.081) 0:00:41.830 ****** 29922 1726853692.90118: entering _queue_task() for managed_node3/shell 29922 1726853692.90120: Creating lock for shell 29922 1726853692.90759: worker is 1 (out of 1 available) 29922 1726853692.90774: exiting _queue_task() for managed_node3/shell 29922 1726853692.90785: done queuing things up, now waiting for results queue to drain 29922 1726853692.90787: waiting for pending results... 
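The "Set NM profile exist flag based on the profile files" task at get_profile_stat.yml:17 was skipped above because its when clause (profile_stat.stat.exists) evaluated to False. The fact it would have set is not visible in this log, so the sketch below only illustrates the gating pattern; the fact name is hypothetical:

- name: Set NM profile exist flag based on the profile files
  ansible.builtin.set_fact:
    lsr_net_profile_exists: true   # hypothetical fact name; only the task name and condition appear in the log
  when: profile_stat.stat.exists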
29922 1726853692.91032: running TaskExecutor() for managed_node3/TASK: Get NM profile info 29922 1726853692.91192: in run() - task 02083763-bbaf-51d4-513b-00000000068d 29922 1726853692.91377: variable 'ansible_search_path' from source: unknown 29922 1726853692.91381: variable 'ansible_search_path' from source: unknown 29922 1726853692.91384: calling self._execute() 29922 1726853692.91439: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853692.91450: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853692.91463: variable 'omit' from source: magic vars 29922 1726853692.92525: variable 'ansible_distribution_major_version' from source: facts 29922 1726853692.92539: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853692.92564: variable 'omit' from source: magic vars 29922 1726853692.92635: variable 'omit' from source: magic vars 29922 1726853692.92806: variable 'profile' from source: include params 29922 1726853692.92830: variable 'interface' from source: set_fact 29922 1726853692.92997: variable 'interface' from source: set_fact 29922 1726853692.93017: variable 'omit' from source: magic vars 29922 1726853692.93115: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29922 1726853692.93182: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29922 1726853692.93209: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29922 1726853692.93227: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853692.93240: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853692.93281: variable 'inventory_hostname' from source: host vars for 'managed_node3' 29922 1726853692.93291: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853692.93294: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853692.93505: Set connection var ansible_connection to ssh 29922 1726853692.93520: Set connection var ansible_module_compression to ZIP_DEFLATED 29922 1726853692.93537: Set connection var ansible_shell_executable to /bin/sh 29922 1726853692.93563: Set connection var ansible_pipelining to False 29922 1726853692.93570: Set connection var ansible_timeout to 10 29922 1726853692.93575: Set connection var ansible_shell_type to sh 29922 1726853692.93603: variable 'ansible_shell_executable' from source: unknown 29922 1726853692.93606: variable 'ansible_connection' from source: unknown 29922 1726853692.93608: variable 'ansible_module_compression' from source: unknown 29922 1726853692.93611: variable 'ansible_shell_type' from source: unknown 29922 1726853692.93613: variable 'ansible_shell_executable' from source: unknown 29922 1726853692.93616: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853692.93618: variable 'ansible_pipelining' from source: unknown 29922 1726853692.93620: variable 'ansible_timeout' from source: unknown 29922 1726853692.93716: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853692.93827: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 29922 1726853692.93839: variable 'omit' from source: magic vars 29922 1726853692.93844: starting attempt loop 29922 1726853692.93847: running the handler 29922 1726853692.93862: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 29922 1726853692.93902: _low_level_execute_command(): starting 29922 1726853692.93910: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 29922 1726853692.94791: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853692.94867: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853692.94894: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853692.94956: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853692.96648: stdout chunk (state=3): >>>/root <<< 29922 1726853692.96816: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853692.96820: stdout chunk (state=3): >>><<< 29922 1726853692.96823: stderr chunk (state=3): >>><<< 29922 1726853692.96845: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853692.96866: _low_level_execute_command(): starting 29922 1726853692.96885: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853692.9685245-31859-251851618257288 `" && echo ansible-tmp-1726853692.9685245-31859-251851618257288="` echo /root/.ansible/tmp/ansible-tmp-1726853692.9685245-31859-251851618257288 `" ) && sleep 0' 29922 1726853692.97789: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853692.97842: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 29922 1726853692.97975: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853692.98007: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853692.98132: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853693.00075: stdout chunk (state=3): >>>ansible-tmp-1726853692.9685245-31859-251851618257288=/root/.ansible/tmp/ansible-tmp-1726853692.9685245-31859-251851618257288 <<< 29922 1726853693.00188: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853693.00242: stderr chunk (state=3): >>><<< 29922 1726853693.00256: stdout chunk (state=3): >>><<< 29922 1726853693.00393: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853692.9685245-31859-251851618257288=/root/.ansible/tmp/ansible-tmp-1726853692.9685245-31859-251851618257288 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853693.00403: variable 'ansible_module_compression' from source: unknown 29922 1726853693.00410: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-29922avbm1zsd/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 29922 1726853693.00449: variable 'ansible_facts' from source: unknown 29922 1726853693.00522: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853692.9685245-31859-251851618257288/AnsiballZ_command.py 29922 1726853693.00732: Sending initial data 29922 1726853693.00736: Sent initial data (156 bytes) 29922 1726853693.02010: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853693.02014: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853693.02017: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853693.02024: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853693.02054: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853693.02068: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853693.02141: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853693.03767: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 29922 1726853693.03823: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 29922 1726853693.03892: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-29922avbm1zsd/tmpb_7dfypu /root/.ansible/tmp/ansible-tmp-1726853692.9685245-31859-251851618257288/AnsiballZ_command.py <<< 29922 1726853693.03896: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853692.9685245-31859-251851618257288/AnsiballZ_command.py" <<< 29922 1726853693.03943: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-29922avbm1zsd/tmpb_7dfypu" to remote "/root/.ansible/tmp/ansible-tmp-1726853692.9685245-31859-251851618257288/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853692.9685245-31859-251851618257288/AnsiballZ_command.py" <<< 29922 1726853693.04923: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853693.05022: stderr chunk (state=3): >>><<< 29922 1726853693.05025: stdout chunk (state=3): >>><<< 29922 1726853693.05028: done transferring module to remote 29922 1726853693.05030: _low_level_execute_command(): starting 29922 1726853693.05034: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853692.9685245-31859-251851618257288/ /root/.ansible/tmp/ansible-tmp-1726853692.9685245-31859-251851618257288/AnsiballZ_command.py && sleep 0' 29922 1726853693.05703: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853693.05714: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853693.05807: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853693.05839: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853693.05916: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853693.07778: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853693.07848: stderr chunk (state=3): >>><<< 29922 1726853693.07852: stdout chunk (state=3): >>><<< 29922 1726853693.07855: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853693.07858: _low_level_execute_command(): starting 29922 1726853693.07980: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853692.9685245-31859-251851618257288/AnsiballZ_command.py && sleep 0' 29922 1726853693.08534: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853693.08545: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853693.08555: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853693.08585: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853693.08599: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 29922 1726853693.08612: stderr chunk (state=3): >>>debug2: match not found <<< 29922 1726853693.08641: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853693.08654: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 29922 1726853693.08666: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<< 29922 1726853693.08674: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 29922 1726853693.08700: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853693.08703: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853693.08725: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 29922 1726853693.08728: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853693.08878: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853693.08882: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853693.08953: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853693.25991: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "start": "2024-09-20 13:34:53.241449", "end": "2024-09-20 13:34:53.258684", "delta": "0:00:00.017235", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "_uses_shell": true, "expand_argument_vars": 
true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 29922 1726853693.27778: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.11.217 closed. <<< 29922 1726853693.27783: stdout chunk (state=3): >>><<< 29922 1726853693.27807: stderr chunk (state=3): >>><<< 29922 1726853693.27811: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "start": "2024-09-20 13:34:53.241449", "end": "2024-09-20 13:34:53.258684", "delta": "0:00:00.017235", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.11.217 closed. 
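The shell invocation and JSON result above belong to the "Get NM profile info" task at get_profile_stat.yml:25. Judging from the raw command, the registered variable referenced a few entries later (nm_profile_exists.rc), and the "...ignoring" marker on the fatal result, the task is roughly the sketch below; the exact YAML is an assumption reconstructed from this log, and the real task likely templates the profile name rather than hard-coding ethtest0:

- name: Get NM profile info
  # "ethtest0" is the rendered profile name in this run
  ansible.builtin.shell: nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc
  register: nm_profile_exists
  ignore_errors: true   # grep exits 1 when no matching connection exists, as happens here

Since no NetworkManager connection for ethtest0 is present, the grep pipeline returns rc=1, the task reports FAILED but is ignored, and nm_profile_exists.rc stays at 1.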
29922 1726853693.27814: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853692.9685245-31859-251851618257288/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 29922 1726853693.27817: _low_level_execute_command(): starting 29922 1726853693.27820: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853692.9685245-31859-251851618257288/ > /dev/null 2>&1 && sleep 0' 29922 1726853693.28482: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853693.28530: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853693.28591: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853693.30529: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853693.30539: stdout chunk (state=3): >>><<< 29922 1726853693.30551: stderr chunk (state=3): >>><<< 29922 1726853693.30572: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853693.30592: handler run complete 29922 1726853693.30643: Evaluated conditional (False): False 29922 1726853693.30665: attempt loop complete, returning result 29922 1726853693.30675: _execute() done 29922 1726853693.30683: dumping result to json 29922 1726853693.30691: done dumping result, returning 29922 1726853693.30703: done running TaskExecutor() for managed_node3/TASK: Get NM profile info [02083763-bbaf-51d4-513b-00000000068d] 29922 1726853693.30712: sending task result for task 02083763-bbaf-51d4-513b-00000000068d fatal: [managed_node3]: FAILED! => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "delta": "0:00:00.017235", "end": "2024-09-20 13:34:53.258684", "rc": 1, "start": "2024-09-20 13:34:53.241449" } MSG: non-zero return code ...ignoring 29922 1726853693.31145: no more pending results, returning what we have 29922 1726853693.31148: results queue empty 29922 1726853693.31148: checking for any_errors_fatal 29922 1726853693.31153: done checking for any_errors_fatal 29922 1726853693.31153: checking for max_fail_percentage 29922 1726853693.31155: done checking for max_fail_percentage 29922 1726853693.31156: checking to see if all hosts have failed and the running result is not ok 29922 1726853693.31157: done checking to see if all hosts have failed 29922 1726853693.31157: getting the remaining hosts for this loop 29922 1726853693.31158: done getting the remaining hosts for this loop 29922 1726853693.31162: getting the next task for host managed_node3 29922 1726853693.31168: done getting next task for host managed_node3 29922 1726853693.31172: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 29922 1726853693.31176: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853693.31180: getting variables 29922 1726853693.31181: in VariableManager get_vars() 29922 1726853693.31210: Calling all_inventory to load vars for managed_node3 29922 1726853693.31212: Calling groups_inventory to load vars for managed_node3 29922 1726853693.31216: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853693.31227: Calling all_plugins_play to load vars for managed_node3 29922 1726853693.31230: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853693.31232: Calling groups_plugins_play to load vars for managed_node3 29922 1726853693.31789: done sending task result for task 02083763-bbaf-51d4-513b-00000000068d 29922 1726853693.31792: WORKER PROCESS EXITING 29922 1726853693.34122: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853693.37541: done with get_vars() 29922 1726853693.37569: done getting variables 29922 1726853693.37682: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 13:34:53 -0400 (0:00:00.475) 0:00:42.306 ****** 29922 1726853693.37716: entering _queue_task() for managed_node3/set_fact 29922 1726853693.38585: worker is 1 (out of 1 available) 29922 1726853693.38599: exiting _queue_task() for managed_node3/set_fact 29922 1726853693.38611: done queuing things up, now waiting for results queue to drain 29922 1726853693.38612: waiting for pending results... 
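The next set_fact task (get_profile_stat.yml:35) is gated on nm_profile_exists.rc == 0; with rc=1 from the failed nmcli|grep pipeline it is about to be skipped, as the entries below show. The fact names it would set are not visible in this log, so the names in this sketch are purely illustrative:

- name: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
  ansible.builtin.set_fact:
    lsr_net_profile_exists: true            # hypothetical names; the log shows only the task
    lsr_net_profile_ansible_managed: true   # name and the rc == 0 condition
  when: nm_profile_exists.rc == 0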
29922 1726853693.39046: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 29922 1726853693.39166: in run() - task 02083763-bbaf-51d4-513b-00000000068e 29922 1726853693.39289: variable 'ansible_search_path' from source: unknown 29922 1726853693.39292: variable 'ansible_search_path' from source: unknown 29922 1726853693.39323: calling self._execute() 29922 1726853693.39426: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853693.39433: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853693.39444: variable 'omit' from source: magic vars 29922 1726853693.40066: variable 'ansible_distribution_major_version' from source: facts 29922 1726853693.40281: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853693.40414: variable 'nm_profile_exists' from source: set_fact 29922 1726853693.40430: Evaluated conditional (nm_profile_exists.rc == 0): False 29922 1726853693.40434: when evaluation is False, skipping this task 29922 1726853693.40436: _execute() done 29922 1726853693.40439: dumping result to json 29922 1726853693.40441: done dumping result, returning 29922 1726853693.40451: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [02083763-bbaf-51d4-513b-00000000068e] 29922 1726853693.40454: sending task result for task 02083763-bbaf-51d4-513b-00000000068e skipping: [managed_node3] => { "changed": false, "false_condition": "nm_profile_exists.rc == 0", "skip_reason": "Conditional result was False" } 29922 1726853693.40744: no more pending results, returning what we have 29922 1726853693.40748: results queue empty 29922 1726853693.40748: checking for any_errors_fatal 29922 1726853693.40757: done checking for any_errors_fatal 29922 1726853693.40757: checking for max_fail_percentage 29922 1726853693.40759: done checking for max_fail_percentage 29922 1726853693.40760: checking to see if all hosts have failed and the running result is not ok 29922 1726853693.40761: done checking to see if all hosts have failed 29922 1726853693.40761: getting the remaining hosts for this loop 29922 1726853693.40762: done getting the remaining hosts for this loop 29922 1726853693.40766: getting the next task for host managed_node3 29922 1726853693.40777: done getting next task for host managed_node3 29922 1726853693.40781: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 29922 1726853693.40786: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853693.40791: getting variables 29922 1726853693.40792: in VariableManager get_vars() 29922 1726853693.40831: Calling all_inventory to load vars for managed_node3 29922 1726853693.40833: Calling groups_inventory to load vars for managed_node3 29922 1726853693.40837: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853693.40851: Calling all_plugins_play to load vars for managed_node3 29922 1726853693.40854: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853693.40858: Calling groups_plugins_play to load vars for managed_node3 29922 1726853693.40940: done sending task result for task 02083763-bbaf-51d4-513b-00000000068e 29922 1726853693.40943: WORKER PROCESS EXITING 29922 1726853693.42389: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853693.44946: done with get_vars() 29922 1726853693.44974: done getting variables 29922 1726853693.45035: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 29922 1726853693.45155: variable 'profile' from source: include params 29922 1726853693.45159: variable 'interface' from source: set_fact 29922 1726853693.45222: variable 'interface' from source: set_fact TASK [Get the ansible_managed comment in ifcfg-ethtest0] *********************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 13:34:53 -0400 (0:00:00.075) 0:00:42.382 ****** 29922 1726853693.45254: entering _queue_task() for managed_node3/command 29922 1726853693.45756: worker is 1 (out of 1 available) 29922 1726853693.45770: exiting _queue_task() for managed_node3/command 29922 1726853693.45786: done queuing things up, now waiting for results queue to drain 29922 1726853693.45787: waiting for pending results... 
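The next task (get_profile_stat.yml:49) is a command action guarded by an earlier stat result. A minimal sketch of its likely shape; the grep pattern, ifcfg path, and register name are assumptions for illustration only:

- name: Get the ansible_managed comment in ifcfg-{{ profile }}
  command: grep 'ansible managed' /etc/sysconfig/network-scripts/ifcfg-{{ profile }}   # pattern and path assumed
  register: ansible_managed_grep   # hypothetical register name
  when: profile_stat.stat.exists

Because ifcfg-ethtest0 does not exist in this run, profile_stat.stat.exists is False and the task is skipped, as the following entries show.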
29922 1726853693.46020: running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-ethtest0 29922 1726853693.46134: in run() - task 02083763-bbaf-51d4-513b-000000000690 29922 1726853693.46147: variable 'ansible_search_path' from source: unknown 29922 1726853693.46151: variable 'ansible_search_path' from source: unknown 29922 1726853693.46191: calling self._execute() 29922 1726853693.46303: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853693.46311: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853693.46320: variable 'omit' from source: magic vars 29922 1726853693.46723: variable 'ansible_distribution_major_version' from source: facts 29922 1726853693.46728: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853693.46848: variable 'profile_stat' from source: set_fact 29922 1726853693.46851: Evaluated conditional (profile_stat.stat.exists): False 29922 1726853693.46853: when evaluation is False, skipping this task 29922 1726853693.46859: _execute() done 29922 1726853693.46861: dumping result to json 29922 1726853693.46863: done dumping result, returning 29922 1726853693.46866: done running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-ethtest0 [02083763-bbaf-51d4-513b-000000000690] 29922 1726853693.46869: sending task result for task 02083763-bbaf-51d4-513b-000000000690 29922 1726853693.46932: done sending task result for task 02083763-bbaf-51d4-513b-000000000690 29922 1726853693.47051: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 29922 1726853693.47104: no more pending results, returning what we have 29922 1726853693.47108: results queue empty 29922 1726853693.47109: checking for any_errors_fatal 29922 1726853693.47117: done checking for any_errors_fatal 29922 1726853693.47118: checking for max_fail_percentage 29922 1726853693.47119: done checking for max_fail_percentage 29922 1726853693.47120: checking to see if all hosts have failed and the running result is not ok 29922 1726853693.47121: done checking to see if all hosts have failed 29922 1726853693.47122: getting the remaining hosts for this loop 29922 1726853693.47123: done getting the remaining hosts for this loop 29922 1726853693.47126: getting the next task for host managed_node3 29922 1726853693.47132: done getting next task for host managed_node3 29922 1726853693.47135: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 29922 1726853693.47140: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853693.47144: getting variables 29922 1726853693.47145: in VariableManager get_vars() 29922 1726853693.47175: Calling all_inventory to load vars for managed_node3 29922 1726853693.47178: Calling groups_inventory to load vars for managed_node3 29922 1726853693.47181: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853693.47193: Calling all_plugins_play to load vars for managed_node3 29922 1726853693.47196: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853693.47199: Calling groups_plugins_play to load vars for managed_node3 29922 1726853693.49803: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853693.50717: done with get_vars() 29922 1726853693.50733: done getting variables 29922 1726853693.50782: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 29922 1726853693.50863: variable 'profile' from source: include params 29922 1726853693.50866: variable 'interface' from source: set_fact 29922 1726853693.50907: variable 'interface' from source: set_fact TASK [Verify the ansible_managed comment in ifcfg-ethtest0] ******************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 13:34:53 -0400 (0:00:00.056) 0:00:42.438 ****** 29922 1726853693.50929: entering _queue_task() for managed_node3/set_fact 29922 1726853693.51185: worker is 1 (out of 1 available) 29922 1726853693.51199: exiting _queue_task() for managed_node3/set_fact 29922 1726853693.51211: done queuing things up, now waiting for results queue to drain 29922 1726853693.51212: waiting for pending results... 
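The verification step at get_profile_stat.yml:56 is the set_fact counterpart of the grep above; a sketch under the same assumptions (fact and register names are hypothetical):

- name: Verify the ansible_managed comment in ifcfg-{{ profile }}
  set_fact:
    lsr_net_profile_ansible_managed: true   # fact name assumed
  when:
    - profile_stat.stat.exists
    - "'ansible managed' in ansible_managed_grep.stdout"   # uses the hypothetical register from the previous sketch

It is skipped for the same reason, and the fingerprint get/verify pair that follows (get_profile_stat.yml:62 and :69) has the same structure and is skipped the same way.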
29922 1726853693.51384: running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-ethtest0 29922 1726853693.51526: in run() - task 02083763-bbaf-51d4-513b-000000000691 29922 1726853693.51532: variable 'ansible_search_path' from source: unknown 29922 1726853693.51535: variable 'ansible_search_path' from source: unknown 29922 1726853693.51552: calling self._execute() 29922 1726853693.51677: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853693.51681: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853693.51683: variable 'omit' from source: magic vars 29922 1726853693.52639: variable 'ansible_distribution_major_version' from source: facts 29922 1726853693.52643: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853693.52894: variable 'profile_stat' from source: set_fact 29922 1726853693.52914: Evaluated conditional (profile_stat.stat.exists): False 29922 1726853693.52918: when evaluation is False, skipping this task 29922 1726853693.52920: _execute() done 29922 1726853693.52923: dumping result to json 29922 1726853693.52937: done dumping result, returning 29922 1726853693.52957: done running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-ethtest0 [02083763-bbaf-51d4-513b-000000000691] 29922 1726853693.52961: sending task result for task 02083763-bbaf-51d4-513b-000000000691 skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 29922 1726853693.53137: no more pending results, returning what we have 29922 1726853693.53141: results queue empty 29922 1726853693.53170: checking for any_errors_fatal 29922 1726853693.53181: done checking for any_errors_fatal 29922 1726853693.53182: checking for max_fail_percentage 29922 1726853693.53184: done checking for max_fail_percentage 29922 1726853693.53185: checking to see if all hosts have failed and the running result is not ok 29922 1726853693.53185: done checking to see if all hosts have failed 29922 1726853693.53186: getting the remaining hosts for this loop 29922 1726853693.53187: done getting the remaining hosts for this loop 29922 1726853693.53192: getting the next task for host managed_node3 29922 1726853693.53198: done getting next task for host managed_node3 29922 1726853693.53200: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 29922 1726853693.53204: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853693.53208: getting variables 29922 1726853693.53210: in VariableManager get_vars() 29922 1726853693.53246: Calling all_inventory to load vars for managed_node3 29922 1726853693.53249: Calling groups_inventory to load vars for managed_node3 29922 1726853693.53374: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853693.53405: Calling all_plugins_play to load vars for managed_node3 29922 1726853693.53409: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853693.53413: Calling groups_plugins_play to load vars for managed_node3 29922 1726853693.53989: done sending task result for task 02083763-bbaf-51d4-513b-000000000691 29922 1726853693.53993: WORKER PROCESS EXITING 29922 1726853693.54694: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853693.56405: done with get_vars() 29922 1726853693.56429: done getting variables 29922 1726853693.56498: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 29922 1726853693.56606: variable 'profile' from source: include params 29922 1726853693.56609: variable 'interface' from source: set_fact 29922 1726853693.56675: variable 'interface' from source: set_fact TASK [Get the fingerprint comment in ifcfg-ethtest0] *************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 13:34:53 -0400 (0:00:00.057) 0:00:42.496 ****** 29922 1726853693.56720: entering _queue_task() for managed_node3/command 29922 1726853693.57051: worker is 1 (out of 1 available) 29922 1726853693.57063: exiting _queue_task() for managed_node3/command 29922 1726853693.57078: done queuing things up, now waiting for results queue to drain 29922 1726853693.57080: waiting for pending results... 
29922 1726853693.57273: running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-ethtest0 29922 1726853693.57346: in run() - task 02083763-bbaf-51d4-513b-000000000692 29922 1726853693.57358: variable 'ansible_search_path' from source: unknown 29922 1726853693.57362: variable 'ansible_search_path' from source: unknown 29922 1726853693.57395: calling self._execute() 29922 1726853693.57483: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853693.57487: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853693.57497: variable 'omit' from source: magic vars 29922 1726853693.57779: variable 'ansible_distribution_major_version' from source: facts 29922 1726853693.57788: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853693.57872: variable 'profile_stat' from source: set_fact 29922 1726853693.57884: Evaluated conditional (profile_stat.stat.exists): False 29922 1726853693.57888: when evaluation is False, skipping this task 29922 1726853693.57891: _execute() done 29922 1726853693.57894: dumping result to json 29922 1726853693.57896: done dumping result, returning 29922 1726853693.57901: done running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-ethtest0 [02083763-bbaf-51d4-513b-000000000692] 29922 1726853693.57905: sending task result for task 02083763-bbaf-51d4-513b-000000000692 29922 1726853693.57986: done sending task result for task 02083763-bbaf-51d4-513b-000000000692 29922 1726853693.57989: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 29922 1726853693.58037: no more pending results, returning what we have 29922 1726853693.58040: results queue empty 29922 1726853693.58041: checking for any_errors_fatal 29922 1726853693.58049: done checking for any_errors_fatal 29922 1726853693.58050: checking for max_fail_percentage 29922 1726853693.58052: done checking for max_fail_percentage 29922 1726853693.58053: checking to see if all hosts have failed and the running result is not ok 29922 1726853693.58053: done checking to see if all hosts have failed 29922 1726853693.58054: getting the remaining hosts for this loop 29922 1726853693.58055: done getting the remaining hosts for this loop 29922 1726853693.58059: getting the next task for host managed_node3 29922 1726853693.58064: done getting next task for host managed_node3 29922 1726853693.58066: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 29922 1726853693.58073: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853693.58078: getting variables 29922 1726853693.58079: in VariableManager get_vars() 29922 1726853693.58114: Calling all_inventory to load vars for managed_node3 29922 1726853693.58116: Calling groups_inventory to load vars for managed_node3 29922 1726853693.58120: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853693.58130: Calling all_plugins_play to load vars for managed_node3 29922 1726853693.58132: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853693.58134: Calling groups_plugins_play to load vars for managed_node3 29922 1726853693.59193: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853693.60769: done with get_vars() 29922 1726853693.60790: done getting variables 29922 1726853693.60850: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 29922 1726853693.60960: variable 'profile' from source: include params 29922 1726853693.60964: variable 'interface' from source: set_fact 29922 1726853693.61097: variable 'interface' from source: set_fact TASK [Verify the fingerprint comment in ifcfg-ethtest0] ************************ task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 13:34:53 -0400 (0:00:00.044) 0:00:42.541 ****** 29922 1726853693.61166: entering _queue_task() for managed_node3/set_fact 29922 1726853693.61799: worker is 1 (out of 1 available) 29922 1726853693.61811: exiting _queue_task() for managed_node3/set_fact 29922 1726853693.61827: done queuing things up, now waiting for results queue to drain 29922 1726853693.61828: waiting for pending results... 
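This task and the three get/verify tasks before it all key off profile_stat, which is registered earlier in get_profile_stat.yml by a stat task roughly like the following; only the profile_stat register name and the .stat.exists usage come from this log, the task name and path are assumptions:

- name: Get stat of the profile file   # name assumed
  stat:
    path: /etc/sysconfig/network-scripts/ifcfg-{{ profile }}   # path assumed
  register: profile_stat

Since ifcfg-ethtest0 is absent, profile_stat.stat.exists is False and every task guarded by it is skipped.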
29922 1726853693.62061: running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-ethtest0 29922 1726853693.62154: in run() - task 02083763-bbaf-51d4-513b-000000000693 29922 1726853693.62169: variable 'ansible_search_path' from source: unknown 29922 1726853693.62175: variable 'ansible_search_path' from source: unknown 29922 1726853693.62202: calling self._execute() 29922 1726853693.62281: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853693.62287: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853693.62296: variable 'omit' from source: magic vars 29922 1726853693.62570: variable 'ansible_distribution_major_version' from source: facts 29922 1726853693.62582: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853693.62663: variable 'profile_stat' from source: set_fact 29922 1726853693.62684: Evaluated conditional (profile_stat.stat.exists): False 29922 1726853693.62688: when evaluation is False, skipping this task 29922 1726853693.62693: _execute() done 29922 1726853693.62696: dumping result to json 29922 1726853693.62699: done dumping result, returning 29922 1726853693.62702: done running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-ethtest0 [02083763-bbaf-51d4-513b-000000000693] 29922 1726853693.62704: sending task result for task 02083763-bbaf-51d4-513b-000000000693 29922 1726853693.62785: done sending task result for task 02083763-bbaf-51d4-513b-000000000693 29922 1726853693.62788: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 29922 1726853693.62874: no more pending results, returning what we have 29922 1726853693.62878: results queue empty 29922 1726853693.62878: checking for any_errors_fatal 29922 1726853693.62883: done checking for any_errors_fatal 29922 1726853693.62884: checking for max_fail_percentage 29922 1726853693.62885: done checking for max_fail_percentage 29922 1726853693.62886: checking to see if all hosts have failed and the running result is not ok 29922 1726853693.62887: done checking to see if all hosts have failed 29922 1726853693.62887: getting the remaining hosts for this loop 29922 1726853693.62889: done getting the remaining hosts for this loop 29922 1726853693.62892: getting the next task for host managed_node3 29922 1726853693.62898: done getting next task for host managed_node3 29922 1726853693.62900: ^ task is: TASK: Assert that the profile is absent - '{{ profile }}' 29922 1726853693.62902: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853693.62906: getting variables 29922 1726853693.62907: in VariableManager get_vars() 29922 1726853693.62933: Calling all_inventory to load vars for managed_node3 29922 1726853693.62935: Calling groups_inventory to load vars for managed_node3 29922 1726853693.62937: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853693.62947: Calling all_plugins_play to load vars for managed_node3 29922 1726853693.62949: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853693.62951: Calling groups_plugins_play to load vars for managed_node3 29922 1726853693.64129: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853693.66309: done with get_vars() 29922 1726853693.66340: done getting variables 29922 1726853693.66415: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 29922 1726853693.66553: variable 'profile' from source: include params 29922 1726853693.66557: variable 'interface' from source: set_fact 29922 1726853693.66620: variable 'interface' from source: set_fact TASK [Assert that the profile is absent - 'ethtest0'] ************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:5 Friday 20 September 2024 13:34:53 -0400 (0:00:00.054) 0:00:42.596 ****** 29922 1726853693.66654: entering _queue_task() for managed_node3/assert 29922 1726853693.67081: worker is 1 (out of 1 available) 29922 1726853693.67095: exiting _queue_task() for managed_node3/assert 29922 1726853693.67109: done queuing things up, now waiting for results queue to drain 29922 1726853693.67111: waiting for pending results... 
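The assert queued above (assert_profile_absent.yml:5) is the check this whole block leads up to. A minimal sketch of an assert of this shape, with the failure message text assumed:

- name: Assert that the profile is absent - '{{ profile }}'
  assert:
    that:
      - not lsr_net_profile_exists
    msg: "Profile {{ profile }} is still present"   # message text assumed

The entries that follow show the conditional (not lsr_net_profile_exists) evaluating True on managed_node3, so the handler reports 'All assertions passed'.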
29922 1726853693.67526: running TaskExecutor() for managed_node3/TASK: Assert that the profile is absent - 'ethtest0' 29922 1726853693.67742: in run() - task 02083763-bbaf-51d4-513b-00000000067c 29922 1726853693.67758: variable 'ansible_search_path' from source: unknown 29922 1726853693.67762: variable 'ansible_search_path' from source: unknown 29922 1726853693.67842: calling self._execute() 29922 1726853693.67958: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853693.67968: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853693.67982: variable 'omit' from source: magic vars 29922 1726853693.69135: variable 'ansible_distribution_major_version' from source: facts 29922 1726853693.69147: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853693.69154: variable 'omit' from source: magic vars 29922 1726853693.69278: variable 'omit' from source: magic vars 29922 1726853693.69550: variable 'profile' from source: include params 29922 1726853693.69555: variable 'interface' from source: set_fact 29922 1726853693.69657: variable 'interface' from source: set_fact 29922 1726853693.69682: variable 'omit' from source: magic vars 29922 1726853693.69725: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29922 1726853693.69776: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29922 1726853693.69798: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29922 1726853693.69817: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853693.69834: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853693.70087: variable 'inventory_hostname' from source: host vars for 'managed_node3' 29922 1726853693.70090: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853693.70093: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853693.70325: Set connection var ansible_connection to ssh 29922 1726853693.70335: Set connection var ansible_module_compression to ZIP_DEFLATED 29922 1726853693.70344: Set connection var ansible_shell_executable to /bin/sh 29922 1726853693.70351: Set connection var ansible_pipelining to False 29922 1726853693.70356: Set connection var ansible_timeout to 10 29922 1726853693.70363: Set connection var ansible_shell_type to sh 29922 1726853693.70583: variable 'ansible_shell_executable' from source: unknown 29922 1726853693.70590: variable 'ansible_connection' from source: unknown 29922 1726853693.70598: variable 'ansible_module_compression' from source: unknown 29922 1726853693.70604: variable 'ansible_shell_type' from source: unknown 29922 1726853693.70610: variable 'ansible_shell_executable' from source: unknown 29922 1726853693.70615: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853693.70622: variable 'ansible_pipelining' from source: unknown 29922 1726853693.70634: variable 'ansible_timeout' from source: unknown 29922 1726853693.70740: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853693.70795: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 29922 1726853693.70812: variable 'omit' from source: magic vars 29922 1726853693.70823: starting attempt loop 29922 1726853693.70829: running the handler 29922 1726853693.70961: variable 'lsr_net_profile_exists' from source: set_fact 29922 1726853693.70976: Evaluated conditional (not lsr_net_profile_exists): True 29922 1726853693.70988: handler run complete 29922 1726853693.71006: attempt loop complete, returning result 29922 1726853693.71013: _execute() done 29922 1726853693.71019: dumping result to json 29922 1726853693.71026: done dumping result, returning 29922 1726853693.71037: done running TaskExecutor() for managed_node3/TASK: Assert that the profile is absent - 'ethtest0' [02083763-bbaf-51d4-513b-00000000067c] 29922 1726853693.71045: sending task result for task 02083763-bbaf-51d4-513b-00000000067c ok: [managed_node3] => { "changed": false } MSG: All assertions passed 29922 1726853693.71218: no more pending results, returning what we have 29922 1726853693.71222: results queue empty 29922 1726853693.71223: checking for any_errors_fatal 29922 1726853693.71231: done checking for any_errors_fatal 29922 1726853693.71232: checking for max_fail_percentage 29922 1726853693.71234: done checking for max_fail_percentage 29922 1726853693.71235: checking to see if all hosts have failed and the running result is not ok 29922 1726853693.71236: done checking to see if all hosts have failed 29922 1726853693.71237: getting the remaining hosts for this loop 29922 1726853693.71238: done getting the remaining hosts for this loop 29922 1726853693.71242: getting the next task for host managed_node3 29922 1726853693.71251: done getting next task for host managed_node3 29922 1726853693.71254: ^ task is: TASK: Include the task 'assert_device_absent.yml' 29922 1726853693.71257: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853693.71261: getting variables 29922 1726853693.71263: in VariableManager get_vars() 29922 1726853693.71298: Calling all_inventory to load vars for managed_node3 29922 1726853693.71301: Calling groups_inventory to load vars for managed_node3 29922 1726853693.71305: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853693.71317: Calling all_plugins_play to load vars for managed_node3 29922 1726853693.71322: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853693.71324: Calling groups_plugins_play to load vars for managed_node3 29922 1726853693.72493: done sending task result for task 02083763-bbaf-51d4-513b-00000000067c 29922 1726853693.72497: WORKER PROCESS EXITING 29922 1726853693.72534: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853693.73422: done with get_vars() 29922 1726853693.73439: done getting variables TASK [Include the task 'assert_device_absent.yml'] ***************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:234 Friday 20 September 2024 13:34:53 -0400 (0:00:00.068) 0:00:42.664 ****** 29922 1726853693.73508: entering _queue_task() for managed_node3/include_tasks 29922 1726853693.73794: worker is 1 (out of 1 available) 29922 1726853693.73807: exiting _queue_task() for managed_node3/include_tasks 29922 1726853693.73819: done queuing things up, now waiting for results queue to drain 29922 1726853693.73821: waiting for pending results... 29922 1726853693.74292: running TaskExecutor() for managed_node3/TASK: Include the task 'assert_device_absent.yml' 29922 1726853693.74301: in run() - task 02083763-bbaf-51d4-513b-0000000000aa 29922 1726853693.74305: variable 'ansible_search_path' from source: unknown 29922 1726853693.74308: calling self._execute() 29922 1726853693.74348: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853693.74363: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853693.74368: variable 'omit' from source: magic vars 29922 1726853693.74776: variable 'ansible_distribution_major_version' from source: facts 29922 1726853693.74780: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853693.74787: _execute() done 29922 1726853693.74790: dumping result to json 29922 1726853693.74793: done dumping result, returning 29922 1726853693.74796: done running TaskExecutor() for managed_node3/TASK: Include the task 'assert_device_absent.yml' [02083763-bbaf-51d4-513b-0000000000aa] 29922 1726853693.74799: sending task result for task 02083763-bbaf-51d4-513b-0000000000aa 29922 1726853693.74864: done sending task result for task 02083763-bbaf-51d4-513b-0000000000aa 29922 1726853693.74867: WORKER PROCESS EXITING 29922 1726853693.74896: no more pending results, returning what we have 29922 1726853693.74901: in VariableManager get_vars() 29922 1726853693.74936: Calling all_inventory to load vars for managed_node3 29922 1726853693.74938: Calling groups_inventory to load vars for managed_node3 29922 1726853693.74942: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853693.74955: Calling all_plugins_play to load vars for managed_node3 29922 1726853693.74959: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853693.74962: Calling groups_plugins_play to load vars for managed_node3 29922 1726853693.76692: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853693.78180: done with get_vars() 29922 1726853693.78202: variable 'ansible_search_path' from source: unknown 29922 1726853693.78217: we have included files to process 29922 1726853693.78218: generating all_blocks data 29922 1726853693.78220: done generating all_blocks data 29922 1726853693.78224: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 29922 1726853693.78225: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 29922 1726853693.78227: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 29922 1726853693.78376: in VariableManager get_vars() 29922 1726853693.78392: done with get_vars() 29922 1726853693.78499: done processing included file 29922 1726853693.78501: iterating over new_blocks loaded from include file 29922 1726853693.78503: in VariableManager get_vars() 29922 1726853693.78512: done with get_vars() 29922 1726853693.78513: filtering new block on tags 29922 1726853693.78528: done filtering new block on tags 29922 1726853693.78530: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed_node3 29922 1726853693.78535: extending task lists for all hosts with included blocks 29922 1726853693.78683: done extending task lists 29922 1726853693.78684: done processing included files 29922 1726853693.78685: results queue empty 29922 1726853693.78686: checking for any_errors_fatal 29922 1726853693.78689: done checking for any_errors_fatal 29922 1726853693.78690: checking for max_fail_percentage 29922 1726853693.78691: done checking for max_fail_percentage 29922 1726853693.78691: checking to see if all hosts have failed and the running result is not ok 29922 1726853693.78692: done checking to see if all hosts have failed 29922 1726853693.78693: getting the remaining hosts for this loop 29922 1726853693.78694: done getting the remaining hosts for this loop 29922 1726853693.78696: getting the next task for host managed_node3 29922 1726853693.78699: done getting next task for host managed_node3 29922 1726853693.78701: ^ task is: TASK: Include the task 'get_interface_stat.yml' 29922 1726853693.78703: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853693.78705: getting variables 29922 1726853693.78706: in VariableManager get_vars() 29922 1726853693.78713: Calling all_inventory to load vars for managed_node3 29922 1726853693.78715: Calling groups_inventory to load vars for managed_node3 29922 1726853693.78717: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853693.78722: Calling all_plugins_play to load vars for managed_node3 29922 1726853693.78724: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853693.78726: Calling groups_plugins_play to load vars for managed_node3 29922 1726853693.79875: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853693.81382: done with get_vars() 29922 1726853693.81409: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Friday 20 September 2024 13:34:53 -0400 (0:00:00.079) 0:00:42.744 ****** 29922 1726853693.81486: entering _queue_task() for managed_node3/include_tasks 29922 1726853693.81841: worker is 1 (out of 1 available) 29922 1726853693.81853: exiting _queue_task() for managed_node3/include_tasks 29922 1726853693.81865: done queuing things up, now waiting for results queue to drain 29922 1726853693.81866: waiting for pending results... 29922 1726853693.82229: running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' 29922 1726853693.82300: in run() - task 02083763-bbaf-51d4-513b-0000000006c4 29922 1726853693.82305: variable 'ansible_search_path' from source: unknown 29922 1726853693.82308: variable 'ansible_search_path' from source: unknown 29922 1726853693.82418: calling self._execute() 29922 1726853693.82459: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853693.82462: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853693.82465: variable 'omit' from source: magic vars 29922 1726853693.82873: variable 'ansible_distribution_major_version' from source: facts 29922 1726853693.82946: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853693.82950: _execute() done 29922 1726853693.82952: dumping result to json 29922 1726853693.82957: done dumping result, returning 29922 1726853693.82959: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' [02083763-bbaf-51d4-513b-0000000006c4] 29922 1726853693.82961: sending task result for task 02083763-bbaf-51d4-513b-0000000006c4 29922 1726853693.83025: done sending task result for task 02083763-bbaf-51d4-513b-0000000006c4 29922 1726853693.83027: WORKER PROCESS EXITING 29922 1726853693.83285: no more pending results, returning what we have 29922 1726853693.83290: in VariableManager get_vars() 29922 1726853693.83324: Calling all_inventory to load vars for managed_node3 29922 1726853693.83327: Calling groups_inventory to load vars for managed_node3 29922 1726853693.83330: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853693.83342: Calling all_plugins_play to load vars for managed_node3 29922 1726853693.83345: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853693.83348: Calling groups_plugins_play to load vars for managed_node3 29922 1726853693.85134: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 29922 1726853693.86867: done with get_vars() 29922 1726853693.86894: variable 'ansible_search_path' from source: unknown 29922 1726853693.86895: variable 'ansible_search_path' from source: unknown 29922 1726853693.86933: we have included files to process 29922 1726853693.86934: generating all_blocks data 29922 1726853693.86936: done generating all_blocks data 29922 1726853693.86937: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 29922 1726853693.86938: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 29922 1726853693.86941: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 29922 1726853693.87130: done processing included file 29922 1726853693.87132: iterating over new_blocks loaded from include file 29922 1726853693.87134: in VariableManager get_vars() 29922 1726853693.87147: done with get_vars() 29922 1726853693.87149: filtering new block on tags 29922 1726853693.87163: done filtering new block on tags 29922 1726853693.87166: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node3 29922 1726853693.87177: extending task lists for all hosts with included blocks 29922 1726853693.87253: done extending task lists 29922 1726853693.87254: done processing included files 29922 1726853693.87255: results queue empty 29922 1726853693.87255: checking for any_errors_fatal 29922 1726853693.87258: done checking for any_errors_fatal 29922 1726853693.87259: checking for max_fail_percentage 29922 1726853693.87260: done checking for max_fail_percentage 29922 1726853693.87260: checking to see if all hosts have failed and the running result is not ok 29922 1726853693.87261: done checking to see if all hosts have failed 29922 1726853693.87262: getting the remaining hosts for this loop 29922 1726853693.87263: done getting the remaining hosts for this loop 29922 1726853693.87265: getting the next task for host managed_node3 29922 1726853693.87268: done getting next task for host managed_node3 29922 1726853693.87270: ^ task is: TASK: Get stat for interface {{ interface }} 29922 1726853693.87275: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853693.87278: getting variables 29922 1726853693.87278: in VariableManager get_vars() 29922 1726853693.87286: Calling all_inventory to load vars for managed_node3 29922 1726853693.87288: Calling groups_inventory to load vars for managed_node3 29922 1726853693.87289: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853693.87295: Calling all_plugins_play to load vars for managed_node3 29922 1726853693.87296: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853693.87299: Calling groups_plugins_play to load vars for managed_node3 29922 1726853693.88475: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853693.90049: done with get_vars() 29922 1726853693.90072: done getting variables 29922 1726853693.90228: variable 'interface' from source: set_fact TASK [Get stat for interface ethtest0] ***************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 13:34:53 -0400 (0:00:00.087) 0:00:42.832 ****** 29922 1726853693.90260: entering _queue_task() for managed_node3/stat 29922 1726853693.90608: worker is 1 (out of 1 available) 29922 1726853693.90622: exiting _queue_task() for managed_node3/stat 29922 1726853693.90635: done queuing things up, now waiting for results queue to drain 29922 1726853693.90637: waiting for pending results... 29922 1726853693.91187: running TaskExecutor() for managed_node3/TASK: Get stat for interface ethtest0 29922 1726853693.91192: in run() - task 02083763-bbaf-51d4-513b-0000000006de 29922 1726853693.91195: variable 'ansible_search_path' from source: unknown 29922 1726853693.91198: variable 'ansible_search_path' from source: unknown 29922 1726853693.91203: calling self._execute() 29922 1726853693.91206: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853693.91209: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853693.91213: variable 'omit' from source: magic vars 29922 1726853693.91777: variable 'ansible_distribution_major_version' from source: facts 29922 1726853693.91781: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853693.91783: variable 'omit' from source: magic vars 29922 1726853693.91785: variable 'omit' from source: magic vars 29922 1726853693.91789: variable 'interface' from source: set_fact 29922 1726853693.91792: variable 'omit' from source: magic vars 29922 1726853693.91976: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29922 1726853693.91980: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29922 1726853693.91984: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29922 1726853693.91987: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853693.91991: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853693.91994: variable 'inventory_hostname' from source: host vars for 'managed_node3' 29922 1726853693.91997: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853693.92000: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 29922 1726853693.92035: Set connection var ansible_connection to ssh 29922 1726853693.92043: Set connection var ansible_module_compression to ZIP_DEFLATED 29922 1726853693.92052: Set connection var ansible_shell_executable to /bin/sh 29922 1726853693.92059: Set connection var ansible_pipelining to False 29922 1726853693.92066: Set connection var ansible_timeout to 10 29922 1726853693.92069: Set connection var ansible_shell_type to sh 29922 1726853693.92096: variable 'ansible_shell_executable' from source: unknown 29922 1726853693.92100: variable 'ansible_connection' from source: unknown 29922 1726853693.92103: variable 'ansible_module_compression' from source: unknown 29922 1726853693.92105: variable 'ansible_shell_type' from source: unknown 29922 1726853693.92108: variable 'ansible_shell_executable' from source: unknown 29922 1726853693.92110: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853693.92115: variable 'ansible_pipelining' from source: unknown 29922 1726853693.92117: variable 'ansible_timeout' from source: unknown 29922 1726853693.92122: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853693.92377: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 29922 1726853693.92382: variable 'omit' from source: magic vars 29922 1726853693.92385: starting attempt loop 29922 1726853693.92388: running the handler 29922 1726853693.92391: _low_level_execute_command(): starting 29922 1726853693.92394: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 29922 1726853693.93523: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853693.93598: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853693.93608: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853693.93631: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853693.93820: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853693.95533: stdout chunk (state=3): >>>/root <<< 29922 1726853693.95743: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853693.95747: stdout chunk (state=3): >>><<< 29922 1726853693.95759: stderr chunk (state=3): >>><<< 29922 1726853693.95782: _low_level_execute_command() done: rc=0, stdout=/root , 
stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853693.95806: _low_level_execute_command(): starting 29922 1726853693.95813: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853693.9578335-31911-144477972262814 `" && echo ansible-tmp-1726853693.9578335-31911-144477972262814="` echo /root/.ansible/tmp/ansible-tmp-1726853693.9578335-31911-144477972262814 `" ) && sleep 0' 29922 1726853693.97276: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853693.97280: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853693.97283: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853693.97285: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853693.97287: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 29922 1726853693.97290: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853693.97301: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853693.97444: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853693.97513: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853693.99597: stdout chunk (state=3): >>>ansible-tmp-1726853693.9578335-31911-144477972262814=/root/.ansible/tmp/ansible-tmp-1726853693.9578335-31911-144477972262814 <<< 29922 1726853693.99651: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853693.99657: stdout 
chunk (state=3): >>><<< 29922 1726853693.99660: stderr chunk (state=3): >>><<< 29922 1726853693.99683: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853693.9578335-31911-144477972262814=/root/.ansible/tmp/ansible-tmp-1726853693.9578335-31911-144477972262814 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853693.99738: variable 'ansible_module_compression' from source: unknown 29922 1726853693.99792: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-29922avbm1zsd/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 29922 1726853694.00079: variable 'ansible_facts' from source: unknown 29922 1726853694.00139: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853693.9578335-31911-144477972262814/AnsiballZ_stat.py 29922 1726853694.00364: Sending initial data 29922 1726853694.00368: Sent initial data (153 bytes) 29922 1726853694.01669: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853694.01685: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 29922 1726853694.01765: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853694.01886: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853694.01982: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853694.03786: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 
1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 29922 1726853694.03844: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 29922 1726853694.03927: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-29922avbm1zsd/tmpn67twd35 /root/.ansible/tmp/ansible-tmp-1726853693.9578335-31911-144477972262814/AnsiballZ_stat.py <<< 29922 1726853694.03942: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853693.9578335-31911-144477972262814/AnsiballZ_stat.py" <<< 29922 1726853694.04020: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-29922avbm1zsd/tmpn67twd35" to remote "/root/.ansible/tmp/ansible-tmp-1726853693.9578335-31911-144477972262814/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853693.9578335-31911-144477972262814/AnsiballZ_stat.py" <<< 29922 1726853694.05347: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853694.05441: stderr chunk (state=3): >>><<< 29922 1726853694.05445: stdout chunk (state=3): >>><<< 29922 1726853694.05454: done transferring module to remote 29922 1726853694.05475: _low_level_execute_command(): starting 29922 1726853694.05582: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853693.9578335-31911-144477972262814/ /root/.ansible/tmp/ansible-tmp-1726853693.9578335-31911-144477972262814/AnsiballZ_stat.py && sleep 0' 29922 1726853694.06658: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853694.06662: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853694.06677: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853694.06877: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853694.06882: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853694.06884: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853694.06953: stderr chunk 
(state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853694.07043: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853694.08962: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853694.08973: stdout chunk (state=3): >>><<< 29922 1726853694.08976: stderr chunk (state=3): >>><<< 29922 1726853694.09020: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853694.09023: _low_level_execute_command(): starting 29922 1726853694.09178: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853693.9578335-31911-144477972262814/AnsiballZ_stat.py && sleep 0' 29922 1726853694.10241: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853694.10250: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853694.10261: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853694.10277: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853694.10290: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 29922 1726853694.10297: stderr chunk (state=3): >>>debug2: match not found <<< 29922 1726853694.10313: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853694.10326: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 29922 1726853694.10423: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853694.10531: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853694.10545: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 29922 1726853694.10876: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853694.26108: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 29922 1726853694.27637: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 29922 1726853694.27642: stdout chunk (state=3): >>><<< 29922 1726853694.27646: stderr chunk (state=3): >>><<< 29922 1726853694.27667: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
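The stat result above ({"changed": false, "stat": {"exists": false}}) is the probe for /sys/class/net/ethtest0: the module reports that the interface no longer exists on the managed host. The task file itself is not part of this log, but a minimal sketch of a stat task that would produce the module arguments recorded just below, assuming the result ends up under the interface_stat name that the later assert reads, looks roughly like this:

    - name: Get stat for interface ethtest0
      stat:
        path: "/sys/class/net/{{ interface }}"   # interface is set via set_fact earlier in the run
        get_attributes: false
        get_checksum: false
        get_mime: false
      register: interface_stat                    # assumed name; the log only shows this variable being read later
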
29922 1726853694.27701: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/ethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853693.9578335-31911-144477972262814/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 29922 1726853694.27709: _low_level_execute_command(): starting 29922 1726853694.27714: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853693.9578335-31911-144477972262814/ > /dev/null 2>&1 && sleep 0' 29922 1726853694.29094: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853694.29221: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853694.29237: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853694.29320: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853694.31209: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853694.31392: stderr chunk (state=3): >>><<< 29922 1726853694.31395: stdout chunk (state=3): >>><<< 29922 1726853694.31414: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853694.31420: handler run complete 29922 1726853694.31443: attempt loop complete, returning result 29922 1726853694.31446: _execute() done 29922 1726853694.31449: dumping result to json 29922 1726853694.31451: done dumping result, returning 29922 1726853694.31461: done running TaskExecutor() for managed_node3/TASK: Get stat for interface ethtest0 [02083763-bbaf-51d4-513b-0000000006de] 29922 1726853694.31464: sending task result for task 02083763-bbaf-51d4-513b-0000000006de 29922 1726853694.31680: done sending task result for task 02083763-bbaf-51d4-513b-0000000006de 29922 1726853694.31683: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 29922 1726853694.31774: no more pending results, returning what we have 29922 1726853694.31778: results queue empty 29922 1726853694.31779: checking for any_errors_fatal 29922 1726853694.31781: done checking for any_errors_fatal 29922 1726853694.31782: checking for max_fail_percentage 29922 1726853694.31784: done checking for max_fail_percentage 29922 1726853694.31785: checking to see if all hosts have failed and the running result is not ok 29922 1726853694.31786: done checking to see if all hosts have failed 29922 1726853694.31787: getting the remaining hosts for this loop 29922 1726853694.31789: done getting the remaining hosts for this loop 29922 1726853694.31794: getting the next task for host managed_node3 29922 1726853694.31803: done getting next task for host managed_node3 29922 1726853694.31806: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 29922 1726853694.31810: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853694.31816: getting variables 29922 1726853694.31817: in VariableManager get_vars() 29922 1726853694.31852: Calling all_inventory to load vars for managed_node3 29922 1726853694.31855: Calling groups_inventory to load vars for managed_node3 29922 1726853694.31858: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853694.32077: Calling all_plugins_play to load vars for managed_node3 29922 1726853694.32082: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853694.32086: Calling groups_plugins_play to load vars for managed_node3 29922 1726853694.34417: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853694.35976: done with get_vars() 29922 1726853694.36006: done getting variables 29922 1726853694.36078: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 29922 1726853694.36198: variable 'interface' from source: set_fact TASK [Assert that the interface is absent - 'ethtest0'] ************************ task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Friday 20 September 2024 13:34:54 -0400 (0:00:00.459) 0:00:43.291 ****** 29922 1726853694.36229: entering _queue_task() for managed_node3/assert 29922 1726853694.36700: worker is 1 (out of 1 available) 29922 1726853694.36716: exiting _queue_task() for managed_node3/assert 29922 1726853694.36726: done queuing things up, now waiting for results queue to drain 29922 1726853694.36728: waiting for pending results... 
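The assert task queued above consumes the interface_stat fact from the preceding probe; the trace below records the conditional it evaluates (not interface_stat.stat.exists) and the default "All assertions passed" message on success. A minimal sketch of such an assertion, assuming the task name templates the interface variable the same way the banner does:

    - name: "Assert that the interface is absent - '{{ interface }}'"
      assert:
        that:
          - not interface_stat.stat.exists        # the conditional evaluated in the trace below
        fail_msg: "Interface {{ interface }} still exists"   # assumed wording; not shown in the log
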
29922 1726853694.37053: running TaskExecutor() for managed_node3/TASK: Assert that the interface is absent - 'ethtest0' 29922 1726853694.37070: in run() - task 02083763-bbaf-51d4-513b-0000000006c5 29922 1726853694.37098: variable 'ansible_search_path' from source: unknown 29922 1726853694.37107: variable 'ansible_search_path' from source: unknown 29922 1726853694.37153: calling self._execute() 29922 1726853694.37261: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853694.37276: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853694.37304: variable 'omit' from source: magic vars 29922 1726853694.37676: variable 'ansible_distribution_major_version' from source: facts 29922 1726853694.37699: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853694.37801: variable 'omit' from source: magic vars 29922 1726853694.37804: variable 'omit' from source: magic vars 29922 1726853694.37857: variable 'interface' from source: set_fact 29922 1726853694.37884: variable 'omit' from source: magic vars 29922 1726853694.38273: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29922 1726853694.38278: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29922 1726853694.38280: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29922 1726853694.38282: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853694.38284: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853694.38286: variable 'inventory_hostname' from source: host vars for 'managed_node3' 29922 1726853694.38287: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853694.38289: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853694.38461: Set connection var ansible_connection to ssh 29922 1726853694.38478: Set connection var ansible_module_compression to ZIP_DEFLATED 29922 1726853694.38491: Set connection var ansible_shell_executable to /bin/sh 29922 1726853694.38521: Set connection var ansible_pipelining to False 29922 1726853694.38584: Set connection var ansible_timeout to 10 29922 1726853694.38592: Set connection var ansible_shell_type to sh 29922 1726853694.38624: variable 'ansible_shell_executable' from source: unknown 29922 1726853694.38633: variable 'ansible_connection' from source: unknown 29922 1726853694.38641: variable 'ansible_module_compression' from source: unknown 29922 1726853694.38647: variable 'ansible_shell_type' from source: unknown 29922 1726853694.38878: variable 'ansible_shell_executable' from source: unknown 29922 1726853694.38881: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853694.38883: variable 'ansible_pipelining' from source: unknown 29922 1726853694.38885: variable 'ansible_timeout' from source: unknown 29922 1726853694.38888: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853694.38948: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, 
class_only=False) 29922 1726853694.39177: variable 'omit' from source: magic vars 29922 1726853694.39180: starting attempt loop 29922 1726853694.39182: running the handler 29922 1726853694.39264: variable 'interface_stat' from source: set_fact 29922 1726853694.39388: Evaluated conditional (not interface_stat.stat.exists): True 29922 1726853694.39398: handler run complete 29922 1726853694.39420: attempt loop complete, returning result 29922 1726853694.39426: _execute() done 29922 1726853694.39433: dumping result to json 29922 1726853694.39440: done dumping result, returning 29922 1726853694.39492: done running TaskExecutor() for managed_node3/TASK: Assert that the interface is absent - 'ethtest0' [02083763-bbaf-51d4-513b-0000000006c5] 29922 1726853694.39501: sending task result for task 02083763-bbaf-51d4-513b-0000000006c5 29922 1726853694.39608: done sending task result for task 02083763-bbaf-51d4-513b-0000000006c5 ok: [managed_node3] => { "changed": false } MSG: All assertions passed 29922 1726853694.39681: no more pending results, returning what we have 29922 1726853694.39684: results queue empty 29922 1726853694.39685: checking for any_errors_fatal 29922 1726853694.39696: done checking for any_errors_fatal 29922 1726853694.39697: checking for max_fail_percentage 29922 1726853694.39699: done checking for max_fail_percentage 29922 1726853694.39700: checking to see if all hosts have failed and the running result is not ok 29922 1726853694.39701: done checking to see if all hosts have failed 29922 1726853694.39702: getting the remaining hosts for this loop 29922 1726853694.39703: done getting the remaining hosts for this loop 29922 1726853694.39707: getting the next task for host managed_node3 29922 1726853694.39715: done getting next task for host managed_node3 29922 1726853694.39719: ^ task is: TASK: Verify network state restored to default 29922 1726853694.39721: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853694.39725: getting variables 29922 1726853694.39726: in VariableManager get_vars() 29922 1726853694.39757: Calling all_inventory to load vars for managed_node3 29922 1726853694.39759: Calling groups_inventory to load vars for managed_node3 29922 1726853694.39763: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853694.39777: Calling all_plugins_play to load vars for managed_node3 29922 1726853694.39780: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853694.39783: Calling groups_plugins_play to load vars for managed_node3 29922 1726853694.40779: WORKER PROCESS EXITING 29922 1726853694.43236: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853694.45861: done with get_vars() 29922 1726853694.45899: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:236 Friday 20 September 2024 13:34:54 -0400 (0:00:00.097) 0:00:43.389 ****** 29922 1726853694.46009: entering _queue_task() for managed_node3/include_tasks 29922 1726853694.46397: worker is 1 (out of 1 available) 29922 1726853694.46415: exiting _queue_task() for managed_node3/include_tasks 29922 1726853694.46427: done queuing things up, now waiting for results queue to drain 29922 1726853694.46428: waiting for pending results... 29922 1726853694.46928: running TaskExecutor() for managed_node3/TASK: Verify network state restored to default 29922 1726853694.47378: in run() - task 02083763-bbaf-51d4-513b-0000000000ab 29922 1726853694.47382: variable 'ansible_search_path' from source: unknown 29922 1726853694.47385: calling self._execute() 29922 1726853694.47440: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853694.47876: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853694.47880: variable 'omit' from source: magic vars 29922 1726853694.48281: variable 'ansible_distribution_major_version' from source: facts 29922 1726853694.48299: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853694.48680: _execute() done 29922 1726853694.48684: dumping result to json 29922 1726853694.48686: done dumping result, returning 29922 1726853694.48689: done running TaskExecutor() for managed_node3/TASK: Verify network state restored to default [02083763-bbaf-51d4-513b-0000000000ab] 29922 1726853694.48691: sending task result for task 02083763-bbaf-51d4-513b-0000000000ab 29922 1726853694.48762: done sending task result for task 02083763-bbaf-51d4-513b-0000000000ab 29922 1726853694.48765: WORKER PROCESS EXITING 29922 1726853694.48800: no more pending results, returning what we have 29922 1726853694.48805: in VariableManager get_vars() 29922 1726853694.48844: Calling all_inventory to load vars for managed_node3 29922 1726853694.48847: Calling groups_inventory to load vars for managed_node3 29922 1726853694.48851: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853694.48868: Calling all_plugins_play to load vars for managed_node3 29922 1726853694.48874: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853694.48878: Calling groups_plugins_play to load vars for managed_node3 29922 1726853694.51729: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 
1726853694.55226: done with get_vars() 29922 1726853694.55258: variable 'ansible_search_path' from source: unknown 29922 1726853694.55279: we have included files to process 29922 1726853694.55281: generating all_blocks data 29922 1726853694.55282: done generating all_blocks data 29922 1726853694.55287: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 29922 1726853694.55288: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 29922 1726853694.55291: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 29922 1726853694.56080: done processing included file 29922 1726853694.56083: iterating over new_blocks loaded from include file 29922 1726853694.56084: in VariableManager get_vars() 29922 1726853694.56096: done with get_vars() 29922 1726853694.56098: filtering new block on tags 29922 1726853694.56115: done filtering new block on tags 29922 1726853694.56117: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed_node3 29922 1726853694.56123: extending task lists for all hosts with included blocks 29922 1726853694.56788: done extending task lists 29922 1726853694.56790: done processing included files 29922 1726853694.56791: results queue empty 29922 1726853694.56791: checking for any_errors_fatal 29922 1726853694.56795: done checking for any_errors_fatal 29922 1726853694.56795: checking for max_fail_percentage 29922 1726853694.56797: done checking for max_fail_percentage 29922 1726853694.56798: checking to see if all hosts have failed and the running result is not ok 29922 1726853694.56798: done checking to see if all hosts have failed 29922 1726853694.56799: getting the remaining hosts for this loop 29922 1726853694.56800: done getting the remaining hosts for this loop 29922 1726853694.56803: getting the next task for host managed_node3 29922 1726853694.56807: done getting next task for host managed_node3 29922 1726853694.56809: ^ task is: TASK: Check routes and DNS 29922 1726853694.56811: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853694.56814: getting variables 29922 1726853694.56815: in VariableManager get_vars() 29922 1726853694.56824: Calling all_inventory to load vars for managed_node3 29922 1726853694.56826: Calling groups_inventory to load vars for managed_node3 29922 1726853694.56829: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853694.56835: Calling all_plugins_play to load vars for managed_node3 29922 1726853694.56837: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853694.56840: Calling groups_plugins_play to load vars for managed_node3 29922 1726853694.58816: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853694.60843: done with get_vars() 29922 1726853694.60873: done getting variables 29922 1726853694.60915: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Friday 20 September 2024 13:34:54 -0400 (0:00:00.149) 0:00:43.538 ****** 29922 1726853694.60941: entering _queue_task() for managed_node3/shell 29922 1726853694.61488: worker is 1 (out of 1 available) 29922 1726853694.61498: exiting _queue_task() for managed_node3/shell 29922 1726853694.61508: done queuing things up, now waiting for results queue to drain 29922 1726853694.61509: waiting for pending results... 29922 1726853694.61625: running TaskExecutor() for managed_node3/TASK: Check routes and DNS 29922 1726853694.61761: in run() - task 02083763-bbaf-51d4-513b-0000000006f6 29922 1726853694.61784: variable 'ansible_search_path' from source: unknown 29922 1726853694.61791: variable 'ansible_search_path' from source: unknown 29922 1726853694.61828: calling self._execute() 29922 1726853694.61937: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853694.61949: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853694.61970: variable 'omit' from source: magic vars 29922 1726853694.62360: variable 'ansible_distribution_major_version' from source: facts 29922 1726853694.62379: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853694.62393: variable 'omit' from source: magic vars 29922 1726853694.62435: variable 'omit' from source: magic vars 29922 1726853694.62479: variable 'omit' from source: magic vars 29922 1726853694.62526: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29922 1726853694.62569: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29922 1726853694.62596: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29922 1726853694.62621: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853694.62638: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853694.62676: variable 'inventory_hostname' from source: host vars for 'managed_node3' 
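The Check routes and DNS task announced above runs a multi-line shell script whose full text appears verbatim further down in the module result (_raw_params of the command invocation). Reassembled as a playbook task, with only the layout reconstructed, it is roughly:

    - name: Check routes and DNS
      shell: |
        set -euo pipefail
        echo IP
        ip a
        echo IP ROUTE
        ip route
        echo IP -6 ROUTE
        ip -6 route
        echo RESOLV
        if [ -f /etc/resolv.conf ]; then
          cat /etc/resolv.conf
        else
          echo NO /etc/resolv.conf
          ls -alrtF /etc/resolv.* || :
        fi

Its output (interfaces, IPv4/IPv6 routes, and the contents of /etc/resolv.conf) is what the pretty-printed STDOUT block below reports for the managed host.
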
29922 1726853694.62685: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853694.62692: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853694.62798: Set connection var ansible_connection to ssh 29922 1726853694.62810: Set connection var ansible_module_compression to ZIP_DEFLATED 29922 1726853694.62825: Set connection var ansible_shell_executable to /bin/sh 29922 1726853694.62837: Set connection var ansible_pipelining to False 29922 1726853694.62846: Set connection var ansible_timeout to 10 29922 1726853694.62853: Set connection var ansible_shell_type to sh 29922 1726853694.62884: variable 'ansible_shell_executable' from source: unknown 29922 1726853694.62892: variable 'ansible_connection' from source: unknown 29922 1726853694.62899: variable 'ansible_module_compression' from source: unknown 29922 1726853694.62905: variable 'ansible_shell_type' from source: unknown 29922 1726853694.62911: variable 'ansible_shell_executable' from source: unknown 29922 1726853694.62917: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853694.62924: variable 'ansible_pipelining' from source: unknown 29922 1726853694.62934: variable 'ansible_timeout' from source: unknown 29922 1726853694.63044: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853694.63093: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 29922 1726853694.63110: variable 'omit' from source: magic vars 29922 1726853694.63120: starting attempt loop 29922 1726853694.63126: running the handler 29922 1726853694.63139: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 29922 1726853694.63168: _low_level_execute_command(): starting 29922 1726853694.63184: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 29922 1726853694.63927: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853694.63993: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853694.64054: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853694.64085: stderr chunk (state=3): >>>debug2: fd 3 
setting O_NONBLOCK <<< 29922 1726853694.64101: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853694.64184: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853694.65877: stdout chunk (state=3): >>>/root <<< 29922 1726853694.66051: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853694.66058: stdout chunk (state=3): >>><<< 29922 1726853694.66060: stderr chunk (state=3): >>><<< 29922 1726853694.66092: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853694.66206: _low_level_execute_command(): starting 29922 1726853694.66210: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853694.6610372-31964-148675480473406 `" && echo ansible-tmp-1726853694.6610372-31964-148675480473406="` echo /root/.ansible/tmp/ansible-tmp-1726853694.6610372-31964-148675480473406 `" ) && sleep 0' 29922 1726853694.67177: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853694.67289: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853694.67319: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853694.67411: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 
<<< 29922 1726853694.69343: stdout chunk (state=3): >>>ansible-tmp-1726853694.6610372-31964-148675480473406=/root/.ansible/tmp/ansible-tmp-1726853694.6610372-31964-148675480473406 <<< 29922 1726853694.69526: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853694.69529: stderr chunk (state=3): >>><<< 29922 1726853694.69532: stdout chunk (state=3): >>><<< 29922 1726853694.69557: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853694.6610372-31964-148675480473406=/root/.ansible/tmp/ansible-tmp-1726853694.6610372-31964-148675480473406 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853694.69589: variable 'ansible_module_compression' from source: unknown 29922 1726853694.69638: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-29922avbm1zsd/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 29922 1726853694.69715: variable 'ansible_facts' from source: unknown 29922 1726853694.69843: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853694.6610372-31964-148675480473406/AnsiballZ_command.py 29922 1726853694.69992: Sending initial data 29922 1726853694.70050: Sent initial data (156 bytes) 29922 1726853694.70588: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853694.70598: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853694.70702: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853694.70908: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853694.70960: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853694.72596: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 29922 1726853694.72653: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 29922 1726853694.72712: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-29922avbm1zsd/tmp7blemkad /root/.ansible/tmp/ansible-tmp-1726853694.6610372-31964-148675480473406/AnsiballZ_command.py <<< 29922 1726853694.72716: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853694.6610372-31964-148675480473406/AnsiballZ_command.py" <<< 29922 1726853694.72803: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-29922avbm1zsd/tmp7blemkad" to remote "/root/.ansible/tmp/ansible-tmp-1726853694.6610372-31964-148675480473406/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853694.6610372-31964-148675480473406/AnsiballZ_command.py" <<< 29922 1726853694.74582: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853694.74586: stderr chunk (state=3): >>><<< 29922 1726853694.74589: stdout chunk (state=3): >>><<< 29922 1726853694.74591: done transferring module to remote 29922 1726853694.74593: _low_level_execute_command(): starting 29922 1726853694.74595: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853694.6610372-31964-148675480473406/ /root/.ansible/tmp/ansible-tmp-1726853694.6610372-31964-148675480473406/AnsiballZ_command.py && sleep 0' 29922 1726853694.75894: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853694.75908: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853694.75919: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853694.75973: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853694.75987: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853694.76187: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853694.77925: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853694.77963: stderr chunk (state=3): >>><<< 29922 1726853694.77977: stdout chunk (state=3): >>><<< 29922 1726853694.78002: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853694.78017: _low_level_execute_command(): starting 29922 1726853694.78086: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853694.6610372-31964-148675480473406/AnsiballZ_command.py && sleep 0' 29922 1726853694.79597: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853694.79617: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853694.79642: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853694.79745: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853694.96303: stdout chunk (state=3): 
>>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 12:2a:53:36:f0:e9 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.11.217/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0\n valid_lft 2785sec preferred_lft 2785sec\n inet6 fe80::102a:53ff:fe36:f0e9/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\n30: rpltstbr: mtu 1500 qdisc noqueue state DOWN group default qlen 1000\n link/ether 6e:7c:f1:8e:1c:81 brd ff:ff:ff:ff:ff:ff\n inet 192.0.2.72/31 scope global noprefixroute rpltstbr\n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.8.1 dev eth0 proto dhcp src 10.31.11.217 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.11.217 metric 100 \n192.0.2.72/31 dev rpltstbr proto kernel scope link src 192.0.2.72 metric 425 linkdown \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 13:34:54.951523", "end": "2024-09-20 13:34:54.960377", "delta": "0:00:00.008854", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 29922 1726853694.97842: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 29922 1726853694.97916: stderr chunk (state=3): >>><<< 29922 1726853694.97926: stdout chunk (state=3): >>><<< 29922 1726853694.98001: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 12:2a:53:36:f0:e9 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.11.217/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0\n valid_lft 2785sec preferred_lft 2785sec\n inet6 fe80::102a:53ff:fe36:f0e9/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\n30: rpltstbr: mtu 1500 qdisc noqueue state DOWN group default qlen 1000\n link/ether 6e:7c:f1:8e:1c:81 brd ff:ff:ff:ff:ff:ff\n inet 192.0.2.72/31 scope global noprefixroute rpltstbr\n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.8.1 dev eth0 proto dhcp src 10.31.11.217 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.11.217 metric 100 \n192.0.2.72/31 dev rpltstbr proto kernel scope link src 192.0.2.72 metric 425 linkdown \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 13:34:54.951523", "end": "2024-09-20 13:34:54.960377", "delta": "0:00:00.008854", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: 
Received exit status from master 0 Shared connection to 10.31.11.217 closed. 29922 1726853694.98199: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853694.6610372-31964-148675480473406/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 29922 1726853694.98209: _low_level_execute_command(): starting 29922 1726853694.98212: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853694.6610372-31964-148675480473406/ > /dev/null 2>&1 && sleep 0' 29922 1726853694.99026: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853694.99033: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853694.99049: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853694.99055: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853694.99074: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 29922 1726853694.99078: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853694.99095: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 29922 1726853694.99102: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853694.99173: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853694.99180: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853694.99195: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853694.99280: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853695.01364: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853695.01368: stdout chunk (state=3): >>><<< 29922 1726853695.01373: stderr chunk (state=3): >>><<< 29922 1726853695.01391: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853695.01443: handler run complete 29922 1726853695.01447: Evaluated conditional (False): False 29922 1726853695.01454: attempt loop complete, returning result 29922 1726853695.01461: _execute() done 29922 1726853695.01470: dumping result to json 29922 1726853695.01552: done dumping result, returning 29922 1726853695.01556: done running TaskExecutor() for managed_node3/TASK: Check routes and DNS [02083763-bbaf-51d4-513b-0000000006f6] 29922 1726853695.01559: sending task result for task 02083763-bbaf-51d4-513b-0000000006f6 ok: [managed_node3] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.008854", "end": "2024-09-20 13:34:54.960377", "rc": 0, "start": "2024-09-20 13:34:54.951523" } STDOUT: IP 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host noprefixroute valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 12:2a:53:36:f0:e9 brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.11.217/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0 valid_lft 2785sec preferred_lft 2785sec inet6 fe80::102a:53ff:fe36:f0e9/64 scope link noprefixroute valid_lft forever preferred_lft forever 30: rpltstbr: mtu 1500 qdisc noqueue state DOWN group default qlen 1000 link/ether 6e:7c:f1:8e:1c:81 brd ff:ff:ff:ff:ff:ff inet 192.0.2.72/31 scope global noprefixroute rpltstbr valid_lft forever preferred_lft forever IP ROUTE default via 10.31.8.1 dev eth0 proto dhcp src 10.31.11.217 metric 100 10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.11.217 metric 100 192.0.2.72/31 dev rpltstbr proto kernel scope link src 192.0.2.72 metric 425 linkdown IP -6 ROUTE fe80::/64 dev eth0 proto kernel metric 1024 pref medium RESOLV # Generated by NetworkManager search us-east-1.aws.redhat.com nameserver 10.29.169.13 nameserver 10.29.170.12 nameserver 10.2.32.1 29922 1726853695.01845: no more pending results, returning what we have 29922 1726853695.01850: results queue empty 29922 1726853695.01851: checking for any_errors_fatal 29922 1726853695.01853: done checking for any_errors_fatal 29922 1726853695.01853: checking for 
max_fail_percentage 29922 1726853695.01855: done checking for max_fail_percentage 29922 1726853695.01856: checking to see if all hosts have failed and the running result is not ok 29922 1726853695.01857: done checking to see if all hosts have failed 29922 1726853695.01858: getting the remaining hosts for this loop 29922 1726853695.01860: done getting the remaining hosts for this loop 29922 1726853695.01864: getting the next task for host managed_node3 29922 1726853695.01870: done getting next task for host managed_node3 29922 1726853695.01875: ^ task is: TASK: Verify DNS and network connectivity 29922 1726853695.01878: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29922 1726853695.01883: getting variables 29922 1726853695.01885: in VariableManager get_vars() 29922 1726853695.01916: Calling all_inventory to load vars for managed_node3 29922 1726853695.01918: Calling groups_inventory to load vars for managed_node3 29922 1726853695.01921: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853695.01934: Calling all_plugins_play to load vars for managed_node3 29922 1726853695.01937: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853695.01940: Calling groups_plugins_play to load vars for managed_node3 29922 1726853695.02484: done sending task result for task 02083763-bbaf-51d4-513b-0000000006f6 29922 1726853695.02488: WORKER PROCESS EXITING 29922 1726853695.03779: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853695.06555: done with get_vars() 29922 1726853695.06605: done getting variables 29922 1726853695.06667: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Friday 20 September 2024 13:34:55 -0400 (0:00:00.457) 0:00:43.996 ****** 29922 1726853695.06706: entering _queue_task() for managed_node3/shell 29922 1726853695.07060: worker is 1 (out of 1 available) 29922 1726853695.07075: exiting _queue_task() for managed_node3/shell 29922 1726853695.07088: done queuing things up, now waiting for results queue to drain 29922 1726853695.07090: waiting for pending results... 
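Note: the "Check routes and DNS" result shown above was produced by a shell task that dumps interface, routing, and resolver state on the managed host. The tasks file itself is not part of this log, but the command is recorded verbatim in the result's cmd field, so a minimal sketch of that task might look like the following. The task name and script are taken from the log; the module choice and the changed_when override are assumptions (the latter inferred from the rendered result reporting "changed": false while the raw command result reports "changed": true).

# Sketch only: reconstructed from the cmd field logged above; not the actual tasks file.
- name: Check routes and DNS
  ansible.builtin.shell: |
    set -euo pipefail
    echo IP
    ip a
    echo IP ROUTE
    ip route
    echo IP -6 ROUTE
    ip -6 route
    echo RESOLV
    if [ -f /etc/resolv.conf ]; then
      cat /etc/resolv.conf
    else
      echo NO /etc/resolv.conf
      ls -alrtF /etc/resolv.* || :
    fi
  changed_when: false  # assumption: rendered result reports changed=false despite the raw command reporting changed=true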
29922 1726853695.07401: running TaskExecutor() for managed_node3/TASK: Verify DNS and network connectivity 29922 1726853695.07504: in run() - task 02083763-bbaf-51d4-513b-0000000006f7 29922 1726853695.07523: variable 'ansible_search_path' from source: unknown 29922 1726853695.07532: variable 'ansible_search_path' from source: unknown 29922 1726853695.07575: calling self._execute() 29922 1726853695.07690: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853695.07704: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853695.07726: variable 'omit' from source: magic vars 29922 1726853695.08139: variable 'ansible_distribution_major_version' from source: facts 29922 1726853695.08165: Evaluated conditional (ansible_distribution_major_version != '6'): True 29922 1726853695.08313: variable 'ansible_facts' from source: unknown 29922 1726853695.09349: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): True 29922 1726853695.09363: variable 'omit' from source: magic vars 29922 1726853695.09427: variable 'omit' from source: magic vars 29922 1726853695.09451: variable 'omit' from source: magic vars 29922 1726853695.09496: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29922 1726853695.09577: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29922 1726853695.09580: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29922 1726853695.09587: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853695.09602: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29922 1726853695.09633: variable 'inventory_hostname' from source: host vars for 'managed_node3' 29922 1726853695.09646: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853695.09655: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853695.09762: Set connection var ansible_connection to ssh 29922 1726853695.09778: Set connection var ansible_module_compression to ZIP_DEFLATED 29922 1726853695.09862: Set connection var ansible_shell_executable to /bin/sh 29922 1726853695.09868: Set connection var ansible_pipelining to False 29922 1726853695.09874: Set connection var ansible_timeout to 10 29922 1726853695.09877: Set connection var ansible_shell_type to sh 29922 1726853695.09879: variable 'ansible_shell_executable' from source: unknown 29922 1726853695.09881: variable 'ansible_connection' from source: unknown 29922 1726853695.09883: variable 'ansible_module_compression' from source: unknown 29922 1726853695.09885: variable 'ansible_shell_type' from source: unknown 29922 1726853695.09888: variable 'ansible_shell_executable' from source: unknown 29922 1726853695.09890: variable 'ansible_host' from source: host vars for 'managed_node3' 29922 1726853695.09892: variable 'ansible_pipelining' from source: unknown 29922 1726853695.09894: variable 'ansible_timeout' from source: unknown 29922 1726853695.09896: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 29922 1726853695.10096: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 29922 1726853695.10112: variable 'omit' from source: magic vars 29922 1726853695.10140: starting attempt loop 29922 1726853695.10148: running the handler 29922 1726853695.10164: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 29922 1726853695.10299: _low_level_execute_command(): starting 29922 1726853695.10303: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 29922 1726853695.10990: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853695.11067: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853695.11093: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853695.11190: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853695.12884: stdout chunk (state=3): >>>/root <<< 29922 1726853695.12989: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853695.13048: stderr chunk (state=3): >>><<< 29922 1726853695.13063: stdout chunk (state=3): >>><<< 29922 1726853695.13094: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853695.13212: _low_level_execute_command(): starting 29922 1726853695.13216: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853695.1310575-31996-30404870584611 `" && echo ansible-tmp-1726853695.1310575-31996-30404870584611="` echo /root/.ansible/tmp/ansible-tmp-1726853695.1310575-31996-30404870584611 `" ) && sleep 0' 29922 1726853695.14430: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853695.14441: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853695.14533: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853695.16485: stdout chunk (state=3): >>>ansible-tmp-1726853695.1310575-31996-30404870584611=/root/.ansible/tmp/ansible-tmp-1726853695.1310575-31996-30404870584611 <<< 29922 1726853695.16667: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853695.16670: stdout chunk (state=3): >>><<< 29922 1726853695.16674: stderr chunk (state=3): >>><<< 29922 1726853695.16693: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853695.1310575-31996-30404870584611=/root/.ansible/tmp/ansible-tmp-1726853695.1310575-31996-30404870584611 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853695.16768: variable 'ansible_module_compression' from source: unknown 29922 1726853695.16833: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-29922avbm1zsd/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 29922 1726853695.16895: variable 'ansible_facts' from source: unknown 29922 1726853695.17004: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853695.1310575-31996-30404870584611/AnsiballZ_command.py 29922 1726853695.17154: Sending initial data 29922 1726853695.17263: Sent initial data (155 bytes) 29922 1726853695.17861: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853695.17958: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 29922 1726853695.18019: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853695.18033: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853695.18188: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853695.20020: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 29922 1726853695.20024: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853695.1310575-31996-30404870584611/AnsiballZ_command.py" <<< 29922 1726853695.20027: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-29922avbm1zsd/tmp6iptkywi /root/.ansible/tmp/ansible-tmp-1726853695.1310575-31996-30404870584611/AnsiballZ_command.py <<< 29922 1726853695.20030: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-29922avbm1zsd/tmp6iptkywi" to remote "/root/.ansible/tmp/ansible-tmp-1726853695.1310575-31996-30404870584611/AnsiballZ_command.py" <<< 29922 1726853695.20032: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853695.1310575-31996-30404870584611/AnsiballZ_command.py" <<< 29922 1726853695.21430: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853695.21443: stderr chunk (state=3): >>><<< 29922 1726853695.21445: stdout chunk (state=3): >>><<< 29922 1726853695.21467: done transferring module to remote 29922 1726853695.21482: _low_level_execute_command(): starting 29922 1726853695.21487: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853695.1310575-31996-30404870584611/ /root/.ansible/tmp/ansible-tmp-1726853695.1310575-31996-30404870584611/AnsiballZ_command.py && sleep 0' 29922 1726853695.22580: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853695.22687: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853695.22749: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853695.22753: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853695.22756: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 29922 1726853695.22758: stderr chunk (state=3): >>>debug2: match not found <<< 29922 1726853695.22760: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853695.22763: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 29922 1726853695.22765: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<< 29922 1726853695.22767: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 29922 1726853695.22773: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853695.22783: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29922 1726853695.22795: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29922 1726853695.22802: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 29922 1726853695.22810: stderr chunk (state=3): >>>debug2: match found <<< 29922 1726853695.22858: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853695.22997: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853695.23190: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853695.23273: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 29922 1726853695.25104: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853695.25249: stderr chunk (state=3): >>><<< 29922 1726853695.25253: stdout chunk (state=3): >>><<< 29922 1726853695.25255: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853695.25257: _low_level_execute_command(): starting 29922 1726853695.25260: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853695.1310575-31996-30404870584611/AnsiballZ_command.py && sleep 0' 29922 1726853695.26275: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853695.26289: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853695.26302: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853695.26366: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853695.26506: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853695.26578: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853695.50797: stdout chunk (state=3): >>> {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org 
mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 16333 0 --:--:-- --:--:-- --:--:-- 16944\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 15190 0 --:--:-- --:--:-- --:--:-- 15315", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 13:34:55.419185", "end": "2024-09-20 13:34:55.503463", "delta": "0:00:00.084278", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 29922 1726853695.52206: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 29922 1726853695.52341: stderr chunk (state=3): >>><<< 29922 1726853695.52344: stdout chunk (state=3): >>><<< 29922 1726853695.52348: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 16333 0 --:--:-- --:--:-- --:--:-- 16944\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 15190 0 --:--:-- --:--:-- --:--:-- 15315", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 13:34:55.419185", "end": "2024-09-20 13:34:55.503463", "delta": "0:00:00.084278", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! 
curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 29922 1726853695.52356: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts "$host"; then\n echo FAILED to lookup host "$host"\n exit 1\n fi\n if ! 
curl -o /dev/null https://"$host"; then\n echo FAILED to contact host "$host"\n exit 1\n fi\ndone\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853695.1310575-31996-30404870584611/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 29922 1726853695.52359: _low_level_execute_command(): starting 29922 1726853695.52362: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853695.1310575-31996-30404870584611/ > /dev/null 2>&1 && sleep 0' 29922 1726853695.53613: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29922 1726853695.53622: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29922 1726853695.53821: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29922 1726853695.53926: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 29922 1726853695.53997: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29922 1726853695.54048: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29922 1726853695.55946: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29922 1726853695.55951: stdout chunk (state=3): >>><<< 29922 1726853695.55956: stderr chunk (state=3): >>><<< 29922 1726853695.55977: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29922 1726853695.55984: handler run complete 29922 1726853695.56009: Evaluated conditional (False): False 29922 1726853695.56018: attempt loop complete, returning result 29922 1726853695.56021: _execute() done 29922 1726853695.56024: dumping result to json 29922 1726853695.56030: done dumping result, returning 29922 1726853695.56039: done running TaskExecutor() for managed_node3/TASK: Verify DNS and network connectivity [02083763-bbaf-51d4-513b-0000000006f7] 29922 1726853695.56043: sending task result for task 02083763-bbaf-51d4-513b-0000000006f7 29922 1726853695.56268: done sending task result for task 02083763-bbaf-51d4-513b-0000000006f7 29922 1726853695.56274: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "delta": "0:00:00.084278", "end": "2024-09-20 13:34:55.503463", "rc": 0, "start": "2024-09-20 13:34:55.419185" } STDOUT: CHECK DNS AND CONNECTIVITY 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org STDERR: % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 305 100 305 0 0 16333 0 --:--:-- --:--:-- --:--:-- 16944 % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 291 100 291 0 0 15190 0 --:--:-- --:--:-- --:--:-- 15315 29922 1726853695.56347: no more pending results, returning what we have 29922 1726853695.56351: results queue empty 29922 1726853695.56352: checking for any_errors_fatal 29922 1726853695.56363: done checking for any_errors_fatal 29922 1726853695.56364: checking for max_fail_percentage 29922 1726853695.56365: done 
checking for max_fail_percentage 29922 1726853695.56367: checking to see if all hosts have failed and the running result is not ok 29922 1726853695.56367: done checking to see if all hosts have failed 29922 1726853695.56368: getting the remaining hosts for this loop 29922 1726853695.56370: done getting the remaining hosts for this loop 29922 1726853695.56375: getting the next task for host managed_node3 29922 1726853695.56389: done getting next task for host managed_node3 29922 1726853695.56396: ^ task is: TASK: meta (flush_handlers) 29922 1726853695.56399: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29922 1726853695.56403: getting variables 29922 1726853695.56405: in VariableManager get_vars() 29922 1726853695.56436: Calling all_inventory to load vars for managed_node3 29922 1726853695.56438: Calling groups_inventory to load vars for managed_node3 29922 1726853695.56442: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853695.56455: Calling all_plugins_play to load vars for managed_node3 29922 1726853695.56458: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853695.56462: Calling groups_plugins_play to load vars for managed_node3 29922 1726853695.60358: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853695.64247: done with get_vars() 29922 1726853695.64487: done getting variables 29922 1726853695.64564: in VariableManager get_vars() 29922 1726853695.64577: Calling all_inventory to load vars for managed_node3 29922 1726853695.64579: Calling groups_inventory to load vars for managed_node3 29922 1726853695.64582: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853695.64587: Calling all_plugins_play to load vars for managed_node3 29922 1726853695.64589: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853695.64592: Calling groups_plugins_play to load vars for managed_node3 29922 1726853695.67321: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853695.80267: done with get_vars() 29922 1726853695.80503: done queuing things up, now waiting for results queue to drain 29922 1726853695.80505: results queue empty 29922 1726853695.80506: checking for any_errors_fatal 29922 1726853695.80510: done checking for any_errors_fatal 29922 1726853695.80511: checking for max_fail_percentage 29922 1726853695.80512: done checking for max_fail_percentage 29922 1726853695.80513: checking to see if all hosts have failed and the running result is not ok 29922 1726853695.80514: done checking to see if all hosts have failed 29922 1726853695.80514: getting the remaining hosts for this loop 29922 1726853695.80515: done getting the remaining hosts for this loop 29922 1726853695.80518: getting the next task for host managed_node3 29922 1726853695.80522: done getting next task for host managed_node3 29922 1726853695.80523: ^ task is: TASK: meta (flush_handlers) 29922 1726853695.80525: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 29922 1726853695.80527: getting variables 29922 1726853695.80528: in VariableManager get_vars() 29922 1726853695.80538: Calling all_inventory to load vars for managed_node3 29922 1726853695.80540: Calling groups_inventory to load vars for managed_node3 29922 1726853695.80543: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853695.80548: Calling all_plugins_play to load vars for managed_node3 29922 1726853695.80550: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853695.80554: Calling groups_plugins_play to load vars for managed_node3 29922 1726853695.83081: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853695.86540: done with get_vars() 29922 1726853695.86572: done getting variables 29922 1726853695.86826: in VariableManager get_vars() 29922 1726853695.86836: Calling all_inventory to load vars for managed_node3 29922 1726853695.86839: Calling groups_inventory to load vars for managed_node3 29922 1726853695.86842: Calling all_plugins_inventory to load vars for managed_node3 29922 1726853695.86847: Calling all_plugins_play to load vars for managed_node3 29922 1726853695.86849: Calling groups_plugins_inventory to load vars for managed_node3 29922 1726853695.86852: Calling groups_plugins_play to load vars for managed_node3 29922 1726853695.89331: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29922 1726853695.92333: done with get_vars() 29922 1726853695.92369: done queuing things up, now waiting for results queue to drain 29922 1726853695.92575: results queue empty 29922 1726853695.92577: checking for any_errors_fatal 29922 1726853695.92578: done checking for any_errors_fatal 29922 1726853695.92579: checking for max_fail_percentage 29922 1726853695.92580: done checking for max_fail_percentage 29922 1726853695.92580: checking to see if all hosts have failed and the running result is not ok 29922 1726853695.92581: done checking to see if all hosts have failed 29922 1726853695.92582: getting the remaining hosts for this loop 29922 1726853695.92583: done getting the remaining hosts for this loop 29922 1726853695.92586: getting the next task for host managed_node3 29922 1726853695.92590: done getting next task for host managed_node3 29922 1726853695.92590: ^ task is: None 29922 1726853695.92592: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29922 1726853695.92593: done queuing things up, now waiting for results queue to drain 29922 1726853695.92594: results queue empty 29922 1726853695.92594: checking for any_errors_fatal 29922 1726853695.92595: done checking for any_errors_fatal 29922 1726853695.92596: checking for max_fail_percentage 29922 1726853695.92597: done checking for max_fail_percentage 29922 1726853695.92597: checking to see if all hosts have failed and the running result is not ok 29922 1726853695.92598: done checking to see if all hosts have failed 29922 1726853695.92599: getting the next task for host managed_node3 29922 1726853695.92601: done getting next task for host managed_node3 29922 1726853695.92602: ^ task is: None 29922 1726853695.92603: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False

PLAY RECAP *********************************************************************
managed_node3 : ok=87 changed=5 unreachable=0 failed=0 skipped=73 rescued=0 ignored=1

Friday 20 September 2024 13:34:55 -0400 (0:00:00.861) 0:00:44.858 ******
===============================================================================
fedora.linux_system_roles.network : Check which services are running ---- 2.23s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.88s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.87s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Gathering Facts --------------------------------------------------------- 1.73s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tests_routing_rules_nm.yml:6
fedora.linux_system_roles.network : Check which packages are installed --- 1.55s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Gather current interface info ------------------------------------------- 1.38s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3
Gathering Facts --------------------------------------------------------- 1.24s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:5
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 1.20s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Install iproute --------------------------------------------------------- 1.18s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16
Gathering Facts --------------------------------------------------------- 1.18s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:3
Create veth interface ethtest0 ------------------------------------------ 1.12s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27
Gathering Facts --------------------------------------------------------- 1.08s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml:3
Gathering Facts --------------------------------------------------------- 1.08s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml:3
Gathering Facts --------------------------------------------------------- 1.01s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:227
fedora.linux_system_roles.network : Check which packages are installed --- 0.93s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Check if system is ostree ----------------------------------------------- 0.89s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.88s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Verify DNS and network connectivity ------------------------------------- 0.86s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24
fedora.linux_system_roles.network : Check which packages are installed --- 0.82s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Re-test connectivity ---------------- 0.77s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
29922 1726853695.92940: RUNNING CLEANUP
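Note: the "Verify DNS and network connectivity" task timed at 0.86s above (task path /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24) runs the lookup-and-curl loop recorded in the cmd field of its result earlier in the log. A minimal sketch reconstructed from that command follows; the module choice and the changed_when override are assumptions (the latter inferred from the rendered result reporting "changed": false), and the real tasks file is not included in this log.

# Sketch only: reconstructed from the logged cmd; not the actual contents of check_network_dns.yml.
- name: Verify DNS and network connectivity
  ansible.builtin.shell: |
    set -euo pipefail
    echo CHECK DNS AND CONNECTIVITY
    for host in mirrors.fedoraproject.org mirrors.centos.org; do
      if ! getent hosts "$host"; then
        echo FAILED to lookup host "$host"
        exit 1
      fi
      if ! curl -o /dev/null https://"$host"; then
        echo FAILED to contact host "$host"
        exit 1
      fi
    done
  changed_when: false  # assumption: rendered result reports changed=false despite the raw command reporting changed=true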